
TensorFlow's computation model, data model, and session model, with a simple neural-network example

**1. TF computation model: the computation graph**
# Define two different graphs to show that tensors in different graphs are independent

import tensorflow as tf

g1 = tf.Graph()
with g1.as_default():
    v = tf.get_variable("v", [1], initializer=tf.zeros_initializer())  # initial value 0

g2 = tf.Graph()
with g2.as_default():
    v = tf.get_variable("v", [1], initializer=tf.ones_initializer())  # initial value 1
    
# Read v back in each graph: g1 prints [0.], g2 prints [1.]
with tf.Session(graph=g1) as sess:
    tf.global_variables_initializer().run()
    with tf.variable_scope("", reuse=True):
        print(sess.run(tf.get_variable("v")))

with tf.Session(graph=g2) as sess:
    tf.global_variables_initializer().run()
    with tf.variable_scope("", reuse=True):
        print(sess.run(tf.get_variable("v")))
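Every operation and tensor belongs to exactly one graph; anything created outside an as_default() block is registered on the process-wide default graph. A minimal sketch illustrating this (the names c, d, and g3 are illustrative, not from the original):

import tensorflow as tf

# Ops created outside any as_default() block land on the default graph
c = tf.constant(1.0, name="c")
print(c.graph is tf.get_default_graph())  # True

# An explicitly created graph is a separate namespace for ops and tensors
g3 = tf.Graph()
with g3.as_default():
    d = tf.constant(2.0, name="d")
print(d.graph is g3)                      # True
print(d.graph is tf.get_default_graph())  # False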

**2. TF data model: tensors**

import tensorflow as tf
a = tf.constant([1.0, 2.0], name="a")
b = tf.constant([2.0, 3.0], name="b")
result = a + b
# Printing a tensor shows its name, shape, and dtype, not its computed value
print(result)

# InteractiveSession installs itself as the default session, so eval() works directly
sess = tf.InteractiveSession()
print(result.eval())
sess.close()
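A tensor is a handle to the result of an operation, not the value itself; it carries three attributes: a name, a shape, and a type. A minimal sketch inspecting them (the variable names are illustrative):

import tensorflow as tf

a = tf.constant([1.0, 2.0], name="a")
b = tf.constant([2.0, 3.0], name="b")
result = tf.add(a, b, name="add")

print(result.name)   # add:0 -- "node_name:output_index"
print(result.shape)  # (2,)
print(result.dtype)  # <dtype: 'float32'>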

**3. TF execution model: sessions**


# Registering sess as the default session lets result.eval() find it
sess = tf.Session()
with sess.as_default():
    print(result.eval())
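Session behavior can be tuned through tf.ConfigProto. A minimal sketch with two commonly used options (the particular values are illustrative, not from the original):

import tensorflow as tf

config = tf.ConfigProto(
    allow_soft_placement=True,   # fall back to CPU when an op has no GPU kernel
    log_device_placement=False)  # set True to log which device runs each op
sess = tf.Session(config=config)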


**4. A complete neural network in TF**

import tensorflow as tf
from numpy.random import RandomState
# Define the network parameters and the input/output placeholders
batch_size = 8
w1= tf.Variable(tf.random_normal([2, 3], stddev=1, seed=1))
w2= tf.Variable(tf.random_normal([3, 1], stddev=1, seed=1))
x = tf.placeholder(tf.float32, shape=(None, 2), name="x-input")
y_= tf.placeholder(tf.float32, shape=(None, 1), name='y-input')
# Define forward propagation, the loss function, and the training op
a = tf.matmul(x, w1)
y = tf.matmul(a, w2)
# clip_by_value keeps y within (1e-10, 1.0) so tf.log never sees 0
cross_entropy = -tf.reduce_mean(y_ * tf.log(tf.clip_by_value(y, 1e-10, 1.0)))
train_step = tf.train.AdamOptimizer(0.001).minimize(cross_entropy)
# Generate a synthetic data set: the label is 1 when x1 + x2 < 1, else 0
rdm = RandomState(1)
X = rdm.rand(128, 2)
Y = [[int(x1 + x2 < 1)] for (x1, x2) in X]
# Create a session to run the TensorFlow program
with tf.Session() as sess:
    init_op = tf.global_variables_initializer()
    sess.run(init_op)
    
    # Print the (untrained) parameter values
    print("w1:", sess.run(w1))
    print("w2:", sess.run(w2))
    print("\n")
    
    # Train the model
    STEPS = 5000
    for i in range(STEPS):
        # Cycle through the 128 samples in batches of batch_size
        start = (i * batch_size) % 128
        end = start + batch_size
        sess.run(train_step, feed_dict={x: X[start:end], y_: Y[start:end]})
        if i % 1000 == 0:
            total_cross_entropy = sess.run(cross_entropy, feed_dict={x: X, y_: Y})
            print("After %d training step(s), cross entropy on all data is %g" % (i, total_cross_entropy))
    
    # Print the trained parameter values
    print("\n")
    print("w1:", sess.run(w1))
    print("w2:", sess.run(w2))



The output is as follows:
w1: [[-0.81131822  1.48459876  0.06532937]
 [-2.44270396  0.0992484   0.59122431]]
w2: [[-0.81131822]
 [ 1.48459876]
 [ 0.06532937]]


After 0 training step(s), cross entropy on all data is 0.0674925
After 1000 training step(s), cross entropy on all data is 0.0163385
After 2000 training step(s), cross entropy on all data is 0.00907547
After 3000 training step(s), cross entropy on all data is 0.00714436
After 4000 training step(s), cross entropy on all data is 0.00578471


w1: [[-1.9618274   2.58235407  1.68203783]
 [-3.4681716   1.06982327  2.11788988]]
w2: [[-1.8247149 ]
 [ 2.68546653]
 [ 1.41819501]]
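The trained weights can be used directly for prediction by running the forward-pass tensor y. A minimal sketch (an assumption, not from the original: these lines would be appended inside the with tf.Session() block above, after the training loop, and the 0.5 decision threshold is illustrative):

    # Assumption: appended inside the `with tf.Session() as sess:` block above,
    # after the training loop, so sess, y, x, X and Y are still in scope
    preds = sess.run(y, feed_dict={x: X})
    # Threshold the raw network output at 0.5 to get a 0/1 prediction
    correct = sum(int(p[0] > 0.5) == label[0] for p, label in zip(preds, Y))
    print("accuracy on the training data: %g" % (correct / 128.0))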

**5. PyCharm**

To launch PyCharm:
cd pycharm/bin
source pycharm.sh

Opening .ipynb files and installing Jupyter: https://blog.csdn.net/gamedr/article/details/72803234