Keras Sequential TimeDistributed model: extreme result differences between sequence lengths 1 and 2

I have two models trained on two essentially identical self-generated datasets: one with sequence length 1 and one with sequence length 2. The first converges like a charm and practically figures out my generating process; the second does only slightly better than chance. What am I doing wrong? Any help is appreciated.
Data generation code
import numpy as np

def make_other_date(samples=720, sequence=1, features=100):
    y_train = np.zeros((samples, sequence, 2))
    x_train = np.random.randint(2, size=(samples, sequence, features))
    for i_sample in range(samples):
        for i_sequence in range(sequence):
            if np.sum(x_train[i_sample, i_sequence, :]) > 50:
                y_train[i_sample, :, :] = np.array([0, 1])
            else:
                y_train[i_sample, :, :] = np.array([1, 0])
    return x_train - 0.5, y_train  # -0.5 to make mean = 0
nsequence = 1
x_train, y_train = make_other_date(36000,sequence = nsequence)
x_val, y_val = make_other_date(360,sequence = nsequence)
print(x_train.shape,y_train.shape)#(36000, 1, 100) (36000, 1, 2)
Model
from keras.models import Sequential
from keras.layers import Activation, Dense, TimeDistributed

model = Sequential()
model.add(TimeDistributed(Dense(10), batch_input_shape=(None, nsequence, 100)))
model.add(TimeDistributed(Dense(10)))  # unnecessary
model.add(TimeDistributed(Dense(2)))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam')
print (model.output_shape) #(None, 1, 2)
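For context, TimeDistributed(Dense) applies one shared weight matrix independently at every timestep, so each timestep's output (and therefore its achievable label) depends only on that timestep's input. A minimal NumPy sketch of this equivalence (the weights and shapes here are illustrative, not the model's actual parameters):

```python
import numpy as np

# Illustrative stand-in for TimeDistributed(Dense(2)) on (samples, timesteps, 100)
# inputs: one shared weight matrix applied per timestep (bias omitted).
rng = np.random.default_rng(0)
W = rng.normal(size=(100, 2))          # shared Dense weights
x = rng.normal(size=(5, 2, 100))       # (samples, timesteps, features)

out = x @ W                            # matmul broadcasts over the timestep axis
per_step = np.stack([x[:, t, :] @ W for t in range(2)], axis=1)

print(np.allclose(out, per_step))  # True: both compute the same per-timestep map
```

Because no information crosses timesteps in this stack, the model can only succeed if each timestep's label is a function of that same timestep's features.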
Results with nsequence = 1
Epoch 10/10
28800/28800 [==============================] - 3s - loss: 3.4264e-05 - val_loss: 2.4744e-05
Results with nsequence = 2
Epoch 10/10
28800/28800 [==============================] - 3s - loss: 0.6053 - val_loss: 0.6042
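One way to see where the nsequence = 2 case goes wrong (a diagnostic sketch reproducing the generator above, not code from the post): `y_train[i_sample,:,:]` assigns the label to every timestep on each inner-loop iteration, so the last timestep's threshold ends up deciding all labels, and timestep 0's stored label agrees with its own feature sum only by chance.

```python
import numpy as np

# Reproduce the post's labelling for sequence = 2 and measure how often
# timestep 0's stored label matches timestep 0's own feature sum.
np.random.seed(0)
samples, sequence, features = 2000, 2, 100
x = np.random.randint(2, size=(samples, sequence, features))
y = np.zeros((samples, sequence, 2))
for i_sample in range(samples):
    for i_sequence in range(sequence):
        if np.sum(x[i_sample, i_sequence, :]) > 50:
            y[i_sample, :, :] = np.array([0, 1])  # overwrites BOTH timesteps
        else:
            y[i_sample, :, :] = np.array([1, 0])

correct_t0 = x[:, 0, :].sum(axis=1) > 50   # what timestep 0's label should encode
stored_t0 = y[:, 0, 1] == 1                # what it actually encodes
agreement = (correct_t0 == stored_t0).mean()
print(agreement)
```

The agreement comes out near 0.5, i.e. chance level for timestep 0, which matches the ~0.6 cross-entropy plateau above; with nsequence = 1 the overwrite is harmless because there is only one timestep to overwrite.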
Yes, I think I understand — thank you. My goal is for the label to depend on each 'i_sequence' individually, in which case (the way I understand it) the model doesn't need any recurrence. Of course, now that I have this working I want to change it to depend on 'i_sequence-1', and then I will need recurrence. Anyway, I tested it and you're right, thank you very much! – NeoTT
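For reference, a sketch of the fix the comment describes (assumption: the only change needed is indexing y_train at [i_sample, i_sequence, :] so each timestep is labelled from its own features):

```python
import numpy as np

def make_other_date_fixed(samples=720, sequence=1, features=100):
    # Same generator as in the question, but each timestep keeps its own label.
    y_train = np.zeros((samples, sequence, 2))
    x_train = np.random.randint(2, size=(samples, sequence, features))
    for i_sample in range(samples):
        for i_sequence in range(sequence):
            if np.sum(x_train[i_sample, i_sequence, :]) > 50:
                y_train[i_sample, i_sequence, :] = np.array([0, 1])  # this step only
            else:
                y_train[i_sample, i_sequence, :] = np.array([1, 0])
    return x_train - 0.5, y_train  # -0.5 to make the input mean 0

x, y = make_other_date_fixed(500, sequence=2)
# After the -0.5 shift, "raw sum > 50" is equivalent to "shifted sum > 0",
# so every timestep's label now matches its own features:
print(np.all((x.sum(axis=2) > 0) == (y[:, :, 1] == 1)))  # True
```

With per-timestep labels like this, the TimeDistributed stack above can learn the nsequence = 2 case just as easily as the nsequence = 1 case.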