TF (Part 3)

1) Common TF functions

# Build one-hot labels from a vector of class indices (TF 1.x API)
batch_size = tf.size(labels)
labels = tf.expand_dims(labels, 1)                       # shape (batch, 1)
indices = tf.expand_dims(tf.range(0, batch_size, 1), 1)  # row index per example
concated = tf.concat([indices, labels], 1)               # (row, class) pairs
onehot_labels = tf.sparse_to_dense(
    concated, tf.stack([batch_size, NUM_CLASSES]), 1.0, 0.0)  # tf.pack was renamed tf.stack
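The scatter that `tf.sparse_to_dense` performs on the concatenated (row, class) pairs can be sketched in plain NumPy; `NUM_CLASSES = 4` and the sample labels below are assumptions for illustration only:

```python
import numpy as np

NUM_CLASSES = 4  # hypothetical class count for this sketch

def one_hot(labels, num_classes):
    """Scatter 1.0 into a zero matrix at (row i, column labels[i]),
    mirroring the sparse_to_dense construction above."""
    batch_size = len(labels)
    onehot = np.zeros((batch_size, num_classes), dtype=np.float32)
    onehot[np.arange(batch_size), labels] = 1.0  # one 1.0 per row
    return onehot

print(one_hot([0, 2, 1], NUM_CLASSES))
```

In current TensorFlow the same result is available directly as `tf.one_hot(labels, NUM_CLASSES)`.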

2) Writing output: use tf.summary.FileWriter to emit summary values

tf.summary.scalar(name, op)          # record a scalar tensor under `name`
summary_op = tf.summary.merge_all()  # merge every summary into a single op

After creating the session, instantiate a tf.summary.FileWriter:

summary_writer = tf.summary.FileWriter(log_dir, graph=sess.graph)  # log_dir: directory for event files

Each time summary_op is run, the latest data is written to the event file:

summary_str = sess.run(summary_op, feed_dict=feed_dict)
summary_writer.add_summary(summary_str, step)  # tag the record with the step
summary_writer.flush()  # force pending records to disk
summary_writer.close()  # close the event file when training ends
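The write/flush/close pattern above can be sketched with an ordinary Python stream standing in for the event file; the record format and names below are illustrative assumptions, not TensorFlow's actual event format:

```python
import io

log = io.StringIO()  # stands in for the event file on disk

def add_summary(stream, step, value):
    # append the latest value, tagged with the step, like FileWriter.add_summary
    stream.write("step=%d loss=%.4f\n" % (step, value))

for step, loss in enumerate([2.0, 1.5, 1.1]):
    add_summary(log, step, loss)
log.flush()  # like summary_writer.flush(): push buffered records out
print(log.getvalue())
```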

3) The graph

with tf.Graph().as_default():
    # build the model here, defining train_op and loss
    with tf.Session() as sess:
        _, loss_value = sess.run([train_op, loss], feed_dict=feed_dict)
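The run-one-step pattern (apply an update, fetch the loss) can be sketched without TensorFlow; plain gradient descent on the assumed toy objective f(w) = (w - 3)^2 plays the role of train_op and loss:

```python
w = 0.0        # a single hypothetical model parameter
losses = []
for step in range(50):
    loss_value = (w - 3.0) ** 2   # what sess.run(loss) would fetch
    w -= 0.1 * 2.0 * (w - 3.0)    # what running train_op would apply
    losses.append(loss_value)
print(losses[0], losses[-1])      # loss should shrink toward 0
```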

4) Saving checkpoints

saver = tf.train.Saver()
# checkpoint_dir is the path prefix; global_step tags each checkpoint file
save_path = saver.save(sess, checkpoint_dir, global_step=step)
saver.restore(sess, save_path)  # reload the saved variables
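The save/restore round trip can be sketched in plain Python, with a pickled dict of parameters standing in for TensorFlow's checkpoint format; the directory, file name, and variable values below are assumptions for illustration:

```python
import os
import pickle
import tempfile

ckpt_dir = tempfile.mkdtemp()
variables = {"w": 0.5, "b": -1.0}  # hypothetical model state

def save(state, directory, step):
    # like saver.save(sess, dir, global_step=step): write a step-tagged file
    path = os.path.join(directory, "model.ckpt-%d" % step)
    with open(path, "wb") as f:
        pickle.dump(state, f)
    return path

def restore(path):
    # like saver.restore(sess, path): load the variables back
    with open(path, "rb") as f:
        return pickle.load(f)

path = save(variables, ckpt_dir, 100)
restored = restore(path)
print(restored)
```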


