CS 20SI Lecture 1: Overview of TensorFlow


1. What is TensorFlow?

Open source software library for numerical computation using data flow graphs.

TensorFlow provides an extensive suite of functions and classes that allow users to build various models from scratch.

2. Why TensorFlow?

a) Python API;

b) Portability: deploy computation to one or more CPUs or GPUs in a desktop, server, or mobile device with a single API;

c) Flexibility: from Raspberry Pi, Android, Windows, iOS, Linux to server farms;

d) Visualization (TensorBoard is da bomb);

e) Checkpoints (for managing experiments);

f) Auto-differentiation (no more taking derivatives by hand. Yay);

g) Large community (> 10,000 commits and > 3,000 TF-related repos in 1 year);

h) Awesome projects already using TensorFlow.

3. Goals of the course

Understand TF's computation graph approach;

Explore TF's built-in functions;

Learn how to build and structure models best suited for a deep learning project.

4. Simplified TensorFlow

1. TF Learn (tf.contrib.learn)

TF Learn is a simplified interface to TF, modeled after scikit-learn, that provides ready-made models users can call with very little code.

TF Learn allows you to load in data, construct a model, fit your model using the training data, evaluate the accuracy, each using a single line (a rough sketch follows this list).

2. TF-Slim (tf.contrib.slim)

Another lightweight API, TF-Slim, simplifies building, training, and evaluating neural networks.

3. High-level APIs built on top of TensorFlow, such as Keras, TFLearn, and Pretty Tensor.

Note the difference between TFLearn (a standalone high-level library) and TF Learn (tf.contrib.learn). TFLearn supports most recent deep learning models, such as ConvNets, LSTMs, BiRNNs, ResNets, and generative networks, and features such as BatchNorm and PReLU.
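As a rough illustration of the TF Learn workflow mentioned in item 1, here is a minimal sketch using the TF 1.x tf.contrib.learn API on synthetic data (the feature column setup, network size, and step count are illustrative assumptions, not from these notes):

import numpy as np
import tensorflow as tf

# synthetic data: 100 samples, 4 features, 3 classes (illustrative only)
x_train = np.random.rand(100, 4).astype(np.float32)
y_train = np.random.randint(0, 3, size=100)

# declare the inputs, then build, fit, and evaluate a classifier --
# each step is essentially a single call
feature_columns = [tf.contrib.layers.real_valued_column("", dimension=4)]
classifier = tf.contrib.learn.DNNClassifier(feature_columns=feature_columns,
                                            hidden_units=[10, 10],
                                            n_classes=3)
classifier.fit(x=x_train, y=y_train, steps=100)
print(classifier.evaluate(x=x_train, y=y_train)["accuracy"])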

5. Data Flow Graphs

TensorFlow separates the definition of computations from their execution.

Step 1: assemble a graph.

Step 2: use a session to execute operations in the graph.

Example:

import tensorflow as tf
a = tf.add(2, 3)
a  # the result is not 5, but a tensor:
# <tf.Tensor 'Add:0' shape=() dtype=int32>

How do we get the value of a?

Create a session, assign it to variable sess so we can call it later.

Within the session, evaluate the graph to fetch the value of a.

sess = tf.Session()
print(sess.run(a))  # >> 5
sess.close()
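Equivalently, the session can be opened in a with block so it is closed automatically; a minimal sketch of the same computation:

with tf.Session() as sess:
    print(sess.run(a))  # >> 5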

6. What is a tensor?

A tensor is just an n-dimensional array.

0-d tensor: scalar (a number)

1-d tensor: vector

2-d tensor: matrix

and so on
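For instance, one constant of each rank can be created with tf.constant (a small illustrative sketch, reusing the tf imported above):

scalar = tf.constant(3)                  # 0-d tensor, shape ()
vector = tf.constant([1, 2, 3])          # 1-d tensor, shape (3,)
matrix = tf.constant([[1, 2], [3, 4]])   # 2-d tensor, shape (2, 2)
with tf.Session() as sess:
    print(sess.run([scalar, vector, matrix]))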

7. Visualized by TensorBoard: why are the nodes called x and y?

When nodes are not given explicit names, TF names them automatically, which is why the two constants show up as x and y.

Nodes: operators, variables, and constants

Edges: tensors
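To get the TensorBoard visualization, the graph definition has to be written to disk first. A minimal TF 1.x sketch (the './graphs' log directory and the explicit names are illustrative choices, not from these notes):

import tensorflow as tf

a = tf.constant(2, name='a')   # explicit names avoid the auto-generated x, y
b = tf.constant(3, name='b')
x = tf.add(a, b, name='add')

with tf.Session() as sess:
    # write the graph so TensorBoard can display it:
    # tensorboard --logdir="./graphs"
    writer = tf.summary.FileWriter('./graphs', sess.graph)
    print(sess.run(x))  # >> 5
writer.close()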

8. Distributed computation

Possible to break graphs into several chunks and run them in parallel across multiple CPUs, GPUs, or devices.

with tf.device('/gpu:1'):
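A fuller sketch of device placement, assuming a machine that actually has a second GPU; log_device_placement makes TF print which device each op was assigned to:

with tf.device('/gpu:1'):
    a = tf.constant([1.0, 2.0, 3.0, 4.0, 5.0, 6.0], shape=[2, 3], name='a')
    b = tf.constant([1.0, 2.0, 3.0, 4.0, 5.0, 6.0], shape=[3, 2], name='b')
    c = tf.matmul(a, b)

sess = tf.Session(config=tf.ConfigProto(log_device_placement=True))
print(sess.run(c))
sess.close()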

9. What if I want to build more than one graph?

You can, but you'd better not. Multiple graphs require multiple sessions, and each session is greedy by default, trying to use all available resources; data cannot be shared between graphs.

It is better to have disconnected subgraphs within one graph.
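If a separate graph really is needed, it can be created explicitly with tf.Graph(); a minimal sketch:

g = tf.Graph()               # a user-created graph, separate from the default graph
with g.as_default():         # ops defined in this block are added to g
    x = tf.add(3, 5)

sess = tf.Session(graph=g)   # the session must be bound to that graph
print(sess.run(x))  # >> 8
sess.close()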

10. Why graphs?

1. Save computation (only run subgraphs that lead to the values you want to fetch; a sketch follows this list);

2. Break computation into small, differential pieces to facilitate auto-differentiation;

3. Facilitate distributed computation: spread the work across multiple CPUs, GPUs, or devices;

4. Many common machine learning models are already taught and visualized as directed graphs.
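A small sketch illustrating point 1: only the subgraph needed for the fetched value is executed, so the second op below never runs (the variable names are illustrative):

x = tf.add(2, 3)
useless = tf.multiply(2, 3)   # not needed to compute x, so it is never executed

with tf.Session() as sess:
    print(sess.run(x))  # >> 5; only the subgraph that produces x runs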
