
Tensorflow - using a tensor as an index

I want to use the following backwards cumulative sum function:

def _backwards_cumsum(x, length, batch_size):
    upper_triangular_ones = np.float32(np.triu(np.ones((length, length))))
    repeated_tri = np.float32(np.kron(np.eye(batch_size), upper_triangular_ones))
    return tf.matmul(repeated_tri,
                     tf.reshape(x, [length, 1]))
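
For a concrete picture, with a plain Python integer length this builds and runs fine. A minimal sketch (batch_size=1 and the values are my own example choices; np and tf are the usual numpy / tensorflow imports):

# each entry becomes the sum of itself and everything after it,
# i.e. a backwards cumulative sum over the length-3 vector
out = _backwards_cumsum(tf.constant([1., 2., 3.]), length=3, batch_size=1)
with tf.Session() as sess:
    print(sess.run(out))  # [[6.] [5.] [3.]]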

But length is a placeholder:

length = tf.placeholder("int32", name='xx') 

So every time it receives a new value, _backwards_cumsum should then be computed.

As soon as I try to run the function, I get this error:

TypeError: 'Tensor' object cannot be interpreted as an index 

Full traceback:

TypeError         Traceback (most recent call last) 
<ipython-input-561-970ae9e96aa1> in <module>() 
----> 1 rewards = _backwards_cumsum(tf.reshape(tf.reshape(decays,[-1,1]) * tf.sigmoid(disc_pred_gen_ph), [-1]), _maxx, batch_size) 

<ipython-input-546-5c6928fac357> in _backwards_cumsum(x, length, batch_size) 
     1 def _backwards_cumsum(x, length, batch_size): 
     2 
----> 3  upper_triangular_ones = np.float32(np.triu(np.ones((length, length)))) 
     4  repeated_tri = np.float32(np.kron(np.eye(batch_size), upper_triangular_ones)) 
     5  return tf.matmul(repeated_tri, 

/Users/onivron/anaconda/envs/tensorflow/lib/python2.7/site-packages/numpy/core/numeric.pyc in ones(shape, dtype, order) 
    190 
    191  """ 
--> 192  a = empty(shape, dtype, order) 
    193  multiarray.copyto(a, 1, casting='unsafe') 
    194  return a 

where _maxx is the same length placeholder as above.

Is there any workaround for this?


Without the full traceback it is hard to tell. Is it a Python interpreter error? A TensorFlow runtime error? etc. –

Answer


The error comes from a tensor object being used, without you realizing it, where a numpy array is expected: length. The cleanest way to use numpy functionality inside a TensorFlow graph is tf.py_func.

# Define a new function that only depends on numpy / non-TensorFlow graph objects
def get_repeated_tri(length, batch_size):
    upper_triangular_ones = np.float32(np.triu(np.ones((length, length))))
    repeated_tri = np.float32(np.kron(np.eye(batch_size), upper_triangular_ones))
    return repeated_tri

length_ = tf.placeholder(tf.int32, name='length')
# tf.py_func converts its tensor inputs to numpy values before calling
# get_repeated_tri; the output dtype must match what the function returns (float32)
repeated_tri = tf.py_func(get_repeated_tri, [length_, batch_size], tf.float32)

# there are also some size mismatches in your `tf.matmul`: repeated_tri is
# (batch_size*length) x (batch_size*length), so x has to be reshaped accordingly
def _backwards_cumsum(repeated_tri, x, length_, batch_size):
    return tf.matmul(repeated_tri, tf.reshape(x, [length_ * batch_size, -1]))

# here batch_size is a plain numpy/Python constant and x is a TensorFlow tensor
some_tensor_out = _backwards_cumsum(repeated_tri, x, length_, batch_size)

some_tensor_out_ = sess.run(some_tensor_out, {length_: length})
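
As a side note (not part of the py_func approach itself): if the numpy matrix is not essential, the backwards cumulative sum can also be built entirely in-graph with tf.cumsum, which copes with a dynamic length without any py_func. A sketch, assuming x is laid out batch-major exactly as in the kron construction above:

def _backwards_cumsum_ingraph(x, length_, batch_size):
    # reshape the flat vector into [batch_size, length] blocks, reverse-cumsum
    # along the length axis, then flatten back to a column vector
    blocks = tf.reshape(x, [batch_size, length_])
    summed = tf.cumsum(blocks, axis=1, reverse=True)
    return tf.reshape(summed, [-1, 1])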

The main problem is that I want different lengths to affect repeated_tri. – moreo


So what is the problem then? This code does support different lengths. –
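
To make that last point concrete: with the py_func graph above, get_repeated_tri is re-executed on every sess.run, so each fed length produces a correspondingly sized matrix. A quick check (a sketch reusing get_repeated_tri and length_ from the answer; batch_size, x and the fed values are my own example choices):

batch_size = 2
x = tf.placeholder(tf.float32, [None], name='x')
repeated_tri = tf.py_func(get_repeated_tri, [length_, batch_size], tf.float32)
out = tf.matmul(repeated_tri, tf.reshape(x, [length_ * batch_size, -1]))

with tf.Session() as sess:
    # get_repeated_tri runs again on every call, with the length fed here
    r1 = sess.run(out, {length_: 3, x: np.arange(6, dtype=np.float32)})
    r2 = sess.run(out, {length_: 5, x: np.arange(10, dtype=np.float32)})
    print(r1.shape, r2.shape)  # (6, 1) (10, 1)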