InvalidArgumentError on softmax in TensorFlow. I have the following function:
def forward_propagation(self, x):
    # The total number of time steps
    T = len(x)
    # During forward propagation we save all hidden states in s because we need them later.
    # We add one additional element for the initial hidden state, which we set to 0.
    s = tf.zeros([T+1, self.hidden_dim])
    # The outputs at each time step. Again, we save them for later.
    o = tf.zeros([T, self.word_dim])

    a = tf.placeholder(tf.float32)
    b = tf.placeholder(tf.float32)
    c = tf.placeholder(tf.float32)

    s_t = tf.nn.tanh(a + tf.reduce_sum(tf.multiply(b, c)))
    o_t = tf.nn.softmax(tf.reduce_sum(tf.multiply(a, b)))

    # For each time step...
    with tf.Session() as sess:
        s = sess.run(s)
        o = sess.run(o)
        for t in range(T):
            # Note that we are indexing U by x[t]. This is the same as multiplying U with a one-hot vector.
            s[t] = sess.run(s_t, feed_dict={a: self.U[:, x[t]], b: self.W, c: s[t-1]})
            o[t] = sess.run(o_t, feed_dict={a: self.V, b: s[t]})
    return [o, s]
self.U, self.V, and self.W are numpy arrays. I am trying to take the softmax on
o_t = tf.nn.softmax(tf.reduce_sum(tf.multiply(a, b)))
but it gives me an error on this line:
o[t] = sess.run(o_t, feed_dict={a: self.V, b: s[t]})
The error is:
InvalidArgumentError (see above for traceback): Expected begin[0] == 0 (got -1) and size[0] == 0 (got 1) when input.dim_size(0) == 0
[[Node: Slice = Slice[Index=DT_INT32, T=DT_INT32, _device="/job:localhost/replica:0/task:0/cpu:0"](Shape_1, Slice/begin, Slice/size)]]
How should I take the softmax in TensorFlow?
I am trying to compute the dot product of a and b. – yusuf
In that case you should use 'tf.matmul' (if both arguments are matrices), or you have to specify the axis to sum over. For example, if 'a' has shape '(n, k)' and 'b' has shape '(k,)', you can compute the dot product with 'tf.reduce_sum(a * b, axis=1)'. –
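A minimal, self-contained sketch of the reduce_sum approach described in that comment, in TF 1.x style to match the question; the sizes word_dim=8000 and hidden_dim=100 and the stand-in arrays are assumptions taken from the shapes in the error messages:

import numpy as np
import tensorflow as tf

word_dim, hidden_dim = 8000, 100                               # assumed sizes, matching the error messages
V = np.random.randn(word_dim, hidden_dim).astype(np.float32)   # stand-in for self.V
s_t_val = np.random.randn(hidden_dim).astype(np.float32)       # stand-in for s[t]

a = tf.placeholder(tf.float32, shape=[word_dim, hidden_dim])
b = tf.placeholder(tf.float32, shape=[hidden_dim])

# Broadcast-multiply each row of `a` by `b` and sum over the hidden axis,
# giving one logit per vocabulary word, then softmax over the vocabulary.
logits = tf.reduce_sum(a * b, axis=1)     # shape (word_dim,)
o_t = tf.nn.softmax(logits)

with tf.Session() as sess:
    out = sess.run(o_t, feed_dict={a: V, b: s_t_val})
    print(out.shape)                      # (8000,)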
tf.matmul gives me this error: Shape must be rank 2 but is rank 1 for 'MatMul' (op: 'MatMul') with input shapes: [8000,100], [100]. I have used o[t].assign(tf.nn.softmax(tf.matmul(self.V, s[t]))) – yusuf
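Regarding the rank error in that follow-up comment, here is a sketch of one possible workaround (an assumption for illustration, not the poster's final code): tf.matmul requires both operands to be rank 2, so the vector can be reshaped into a column first and squeezed back afterwards.

import numpy as np
import tensorflow as tf

word_dim, hidden_dim = 8000, 100                               # assumed sizes from the error message
V = np.random.randn(word_dim, hidden_dim).astype(np.float32)   # stand-in for self.V
s_t_val = np.random.randn(hidden_dim).astype(np.float32)       # stand-in for s[t]

a = tf.placeholder(tf.float32, shape=[word_dim, hidden_dim])
b = tf.placeholder(tf.float32, shape=[hidden_dim])

# Make `b` a (hidden_dim, 1) column so both matmul operands are rank 2.
logits = tf.matmul(a, tf.expand_dims(b, 1))        # shape (word_dim, 1)
o_t = tf.nn.softmax(tf.squeeze(logits, axis=1))    # back to shape (word_dim,)

with tf.Session() as sess:
    print(sess.run(o_t, feed_dict={a: V, b: s_t_val}).shape)   # (8000,)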