How do I use a multi-layer bidirectional LSTM in TensorFlow?
I have already implemented a bidirectional LSTM, but I want to compare that model with a version that has multiple layers added.
How should I add code to this part?
import tensorflow as tf
from tensorflow.contrib import rnn

x = tf.unstack(tf.transpose(x, perm=[1, 0, 2]))
#print(x[0].get_shape())
# Define lstm cells with tensorflow
# Forward direction cell
lstm_fw_cell = rnn.BasicLSTMCell(n_hidden, forget_bias=1.0)
# Backward direction cell
lstm_bw_cell = rnn.BasicLSTMCell(n_hidden, forget_bias=1.0)
# Get lstm cell output
try:
    outputs, _, _ = rnn.static_bidirectional_rnn(lstm_fw_cell, lstm_bw_cell, x,
                                                 dtype=tf.float32)
except Exception:  # Old TensorFlow version only returns outputs not states
    outputs = rnn.static_bidirectional_rnn(lstm_fw_cell, lstm_bw_cell, x,
                                           dtype=tf.float32)
# Linear activation, using rnn inner loop last output
outputs = tf.stack(outputs, axis=1)
outputs = tf.reshape(outputs, (batch_size*n_steps, n_hidden*2))
outputs = tf.matmul(outputs, weights['out']) + biases['out']
outputs = tf.reshape(outputs, (batch_size, n_steps, n_classes))
I tried this and got this error: ValueError: Variable bidirectional_rnn/fw/lstm_cell/kernel already exists, disallowed. Did you mean to set reuse=True in VarScope? Can you provide a working example? – Rahul
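One way to stack several bidirectional layers in TensorFlow 1.x is tf.contrib.rnn.stack_bidirectional_rnn, which takes a separate cell object for every layer and direction; re-using a single cell instance across layers, or building the bidirectional RNN twice in the same variable scope, is what typically triggers the "Variable ... already exists" error above. Below is a minimal sketch, assuming the same x, n_hidden and static-RNN setup as in the snippet; the layer count n_layers and the helper name stacked_bilstm are illustrative, not part of the original code.

import tensorflow as tf
from tensorflow.contrib import rnn

def stacked_bilstm(x, n_hidden, n_layers):
    # x: Python list of n_steps tensors of shape (batch_size, n_input),
    # i.e. the result of the tf.unstack call in the question.
    # One independent cell per layer and per direction, so each layer
    # creates its own kernel variables instead of colliding in the scope.
    fw_cells = [rnn.BasicLSTMCell(n_hidden, forget_bias=1.0) for _ in range(n_layers)]
    bw_cells = [rnn.BasicLSTMCell(n_hidden, forget_bias=1.0) for _ in range(n_layers)]
    outputs, _, _ = rnn.stack_bidirectional_rnn(fw_cells, bw_cells, x,
                                                dtype=tf.float32)
    # outputs is a list of n_steps tensors of shape (batch_size, 2*n_hidden),
    # so the stack/reshape/matmul projection from the question still applies.
    return outputs

outputs = stacked_bilstm(x, n_hidden, n_layers=2)  # n_layers=2 is an assumed example value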