2017-03-16

I read the advice here to always use tf.get_variable(...), though this seems a bit cumbersome when I try to implement a network. Am I using tf.get_variable() correctly?

For example:

import tensorflow as tf

def create_weights(shape, name='weights',
                   initializer=tf.random_normal_initializer(0, 0.1)):
    weights = tf.get_variable(name, shape, initializer=initializer)
    print("weights created named: {}".format(weights.name))
    return weights

def LeNet(in_units, keep_prob):
    # define the network
    # (conv, maxpool, fc, dropout and create_bias are helpers defined elsewhere)
    with tf.variable_scope("conv1"):
        conv1 = conv(in_units, create_weights([5, 5, 3, 32]), create_bias([32]))
        pool1 = maxpool(conv1)

    with tf.variable_scope("conv2"):
        conv2 = conv(pool1, create_weights([5, 5, 32, 64]), create_bias([64]))
        pool2 = maxpool(conv2)

    # reshape the network to feed it into the fully connected layers
    with tf.variable_scope("flatten"):
        flatten = tf.reshape(pool2, [-1, 1600])
        flatten = dropout(flatten, keep_prob)

    with tf.variable_scope("fc1"):
        fc1 = fc(flatten, create_weights([1600, 120]), biases=create_bias([120]))
        fc1 = dropout(fc1, keep_prob)

    with tf.variable_scope("fc2"):
        fc2 = fc(fc1, create_weights([120, 84]), biases=create_bias([84]))

    with tf.variable_scope("logits"):
        logits = fc(fc2, create_weights([84, 43]), biases=create_bias([43]))

    return logits

I have to use with tf.variable_scope(...) every single time I call create_weights. Moreover, if I want to change the weights of the conv1 variable to [7, 7, 3, 32] instead of [5, 5, 3, 32], I have to restart the kernel, since the variable already exists. On the other hand, if I use tf.Variable(...) I don't have any of these problems.
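To make the problem concrete, here is a minimal sketch (TF 1.x, assuming the create_weights helper above); tf.reset_default_graph() is one way to rebuild the graph with a new shape without restarting the kernel:

import tensorflow as tf

with tf.variable_scope("conv1"):
    w = create_weights([5, 5, 3, 32])   # creates the variable "conv1/weights"

# Calling create_weights([7, 7, 3, 32]) in the same scope now raises
# "ValueError: Variable conv1/weights already exists, disallowed."

# One way to rebuild with a new shape without restarting the kernel:
tf.reset_default_graph()
with tf.variable_scope("conv1"):
    w = create_weights([7, 7, 3, 32])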

Am I using tf.variable_scope(...) incorrectly?

Answer

It seems you cannot change a variable that already exists in a scope, so only after restarting the kernel can you change a variable you defined before. (In fact, you are creating a new one, since the previous one has been deleted.)
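For what it's worth, a minimal sketch of this behaviour in TF 1.x: an existing variable cannot be created a second time, but it can be retrieved with reuse=True as long as the requested shape matches:

import tensorflow as tf

with tf.variable_scope("conv1"):
    w = tf.get_variable("weights", [5, 5, 3, 32])

# reuse=True makes tf.get_variable return the existing variable instead of
# creating a new one; asking for a different shape still raises a ValueError.
with tf.variable_scope("conv1", reuse=True):
    w_again = tf.get_variable("weights", [5, 5, 3, 32])

print(w is w_again)   # True: both refer to the same underlying variable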

...

This is just my guess, though... I would appreciate it if someone could give a detailed answer.