I have already written the code below, and it fails at self.optimizer.compute_gradients(self.output, all_variables).
What is the difference between optimizer.compute_gradients() and tf.gradients() in TensorFlow?
import tensorflow as tf
import tensorlayer as tl
from tensorflow.python.framework import ops
import numpy as np
class Network1():
    def __init__(self):
        ops.reset_default_graph()
        tl.layers.clear_layers_name()
        self.sess = tf.Session()
        self.optimizer = tf.train.AdamOptimizer(learning_rate=0.1)
        self.input_x = tf.placeholder(tf.float32, shape=[None, 784], name="input")
        input_layer = tl.layers.InputLayer(self.input_x)
        relu1 = tl.layers.DenseLayer(input_layer, n_units=800, act=tf.nn.relu, name="relu1")
        relu2 = tl.layers.DenseLayer(relu1, n_units=500, act=tf.nn.relu, name="relu2")
        self.output = relu2.all_layers[-1]
        all_variables = relu2.all_layers
        self.gradient = self.optimizer.compute_gradients(self.output, all_variables)
        init_op = tf.initialize_all_variables()
        self.sess.run(init_op)
It fails with the error:
TypeError: Argument is not a tf.Variable: Tensor("relu1/Relu:0", shape=(?, 800), dtype=float32)
However, when I replace that line with tf.gradients(self.output, all_variables), the code works fine, or at least no error is reported. Where did I go wrong? I thought the two methods were actually doing the same thing, i.e. returning a list of (gradient, variable) pairs.
What is 'tensorlayers'? We have 'tf.contrib.layers'. – drpng