TensorFlow bidirectional LSTM state dimensions

I am using batch_size = 100 and n_units = 74. When the code below runs, rnn_state_fw comes back with shape (1, 2, 100, 74). I can see that 100 is the batch_size and 74 is the state_size, but what do the 1 and the 2 refer to?
forward_cell = tf.contrib.rnn.DropoutWrapper(
    tf.contrib.rnn.LSTMCell(hidden_size,
                            initializer=tf.random_uniform_initializer(-1.0, 1.0),
                            state_is_tuple=True),
    input_keep_prob=self.dropout_keep_prob_lstm_input,
    output_keep_prob=self.dropout_keep_prob_lstm_output)
backward_cell = tf.contrib.rnn.DropoutWrapper(
    tf.contrib.rnn.LSTMCell(hidden_size,
                            initializer=tf.random_uniform_initializer(-1.0, 1.0),
                            state_is_tuple=True),
    input_keep_prob=self.dropout_keep_prob_lstm_input,
    output_keep_prob=self.dropout_keep_prob_lstm_output)
forward_cell = tf.contrib.rnn.MultiRNNCell([forward_cell for _ in range(num_layers)],
                                           state_is_tuple=True)
backward_cell = tf.contrib.rnn.MultiRNNCell([backward_cell for _ in range(num_layers)],
                                            state_is_tuple=True)
initial_forward_state = forward_cell.zero_state(self.batch_size, tf.float32)
initial_backward_state = backward_cell.zero_state(self.batch_size, tf.float32)
rnn_output, rnn_state_fw, rnn_state_bw = tf.contrib.rnn.static_bidirectional_rnn(
    forward_cell, backward_cell, rnn_input,
    initial_state_fw=initial_forward_state,
    initial_state_bw=initial_backward_state,
    sequence_length=self.seq_lengths)
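With state_is_tuple=True, a MultiRNNCell's state is presumably a tuple of one LSTMStateTuple per layer, and each LSTMStateTuple is a (c, h) pair of arrays of shape (batch_size, state_size). Under that reading, the leading 1 would be num_layers and the 2 would be the (c, h) pair. A minimal sketch with NumPy (not TensorFlow; the shapes here are an assumption based on that structure, with num_layers = 1 hypothesized from the observed shape):

```python
import numpy as np

num_layers = 1    # assumed: the leading 1 in (1, 2, 100, 74)
batch_size = 100
n_units = 74

# One (c, h) pair per layer: cell state and hidden state,
# each of shape (batch_size, n_units), mirroring LSTMStateTuple.
state = [
    (np.zeros((batch_size, n_units)), np.zeros((batch_size, n_units)))
    for _ in range(num_layers)
]

# Stacking the nested structure into one array reproduces the
# observed shape (num_layers, 2, batch_size, n_units).
stacked = np.array(state)
print(stacked.shape)  # (1, 2, 100, 74)
```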
Thanks for the explanation. I will go through the link. – LCP