What is the purpose of nb_epoch in Keras's fit_generator?

It seems I can get exactly the same result by making num_samples larger and keeping nb_epoch = 1. I thought the point of multiple epochs was to iterate over the same data several times, but Keras does not re-instantiate the generator at the end of each epoch; it just keeps going. For example, when training this autoencoder:
import numpy as np
from keras.layers import (Convolution2D, MaxPooling2D,
                          UpSampling2D, Activation)
from keras.models import Sequential

# 1000 random RGB "images", each carrying an explicit batch dimension of 1
rand_imgs = [np.random.rand(1, 100, 100, 3) for _ in range(1000)]

def keras_generator():
    # i is never reset, so the generator walks straight through
    # rand_imgs across epoch boundaries
    i = 0
    while True:
        print(i)
        rand_img = rand_imgs[i]
        i += 1
        yield (rand_img, rand_img)

layers = [
    Convolution2D(20, 5, 5, border_mode='same',
                  input_shape=(100, 100, 3), activation='relu'),
    MaxPooling2D((2, 2), border_mode='same'),
    Convolution2D(3, 5, 5, border_mode='same', activation='relu'),
    UpSampling2D((2, 2)),
    Convolution2D(3, 5, 5, border_mode='same', activation='relu')]

autoencoder = Sequential()
for layer in layers:
    autoencoder.add(layer)

gen = keras_generator()
autoencoder.compile(optimizer='adadelta', loss='binary_crossentropy')
history = autoencoder.fit_generator(gen, samples_per_epoch=100, nb_epoch=2)
It looks like I get the same results with (samples_per_epoch=100, nb_epoch=2) as I do with (samples_per_epoch=200, nb_epoch=1). Am I using fit_generator as intended?
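To see concretely why the two settings look the same, here is a minimal sketch (not part of the training code above, and ignoring weight updates entirely) that pulls samples from two fresh copies of keras_generator the way fit_generator would. Because the generator's index i is never reset between epochs, both configurations consume the first 200 entries of rand_imgs in the same order:

gen_a = keras_generator()
consumed_a = [next(gen_a) for _ in range(100 * 2)]   # samples_per_epoch=100, nb_epoch=2

gen_b = keras_generator()
consumed_b = [next(gen_b) for _ in range(200 * 1)]   # samples_per_epoch=200, nb_epoch=1

# Same inputs, same order (the print(i) calls will show 0..199 for each generator)
assert all(np.array_equal(a[0], b[0]) for a, b in zip(consumed_a, consumed_b))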