

Just define two optimizers and switch between them:

sgd_optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
adap_optimizer = tf.train.AdamOptimizer(learning_rate).minimize(cost)
...
for epoch in range(100):
    for (x, y) in zip(train_X, train_Y):
        # Adam for the first 50 epochs, plain SGD afterwards
        optimizer = sgd_optimizer if epoch > 50 else adap_optimizer
        sess.run(optimizer, feed_dict={X: x, Y: y})

An optimizer only encapsulates how the gradients are applied to the tensors and may hold a few variables of its own (Adam, for instance, keeps moment accumulators). The model weights are not stored inside the optimizer, so you can switch optimizers easily.
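
Below is a minimal, self-contained sketch of the idea, assuming a toy linear-regression model with TF1-style sessions; the data, placeholders, and learning rate are illustrative, not from the original answer:

import numpy as np
import tensorflow as tf  # TensorFlow 1.x API

# Toy data: y = 2x + 1 (illustrative assumption)
train_X = np.linspace(0.0, 1.0, 20).astype(np.float32)
train_Y = 2.0 * train_X + 1.0

X = tf.placeholder(tf.float32)
Y = tf.placeholder(tf.float32)
W = tf.Variable(0.0)
b = tf.Variable(0.0)
cost = tf.reduce_mean(tf.square(W * X + b - Y))

learning_rate = 0.01
sgd_optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
adap_optimizer = tf.train.AdamOptimizer(learning_rate).minimize(cost)

with tf.Session() as sess:
    # Initialize *after* both optimizers are defined, so Adam's
    # slot variables (its moment accumulators) get initialized too.
    sess.run(tf.global_variables_initializer())
    for epoch in range(100):
        for (x, y) in zip(train_X, train_Y):
            # Adam for the first 50 epochs, plain SGD afterwards
            optimizer = sgd_optimizer if epoch > 50 else adap_optimizer
            sess.run(optimizer, feed_dict={X: x, Y: y})
    print(sess.run([W, b]))  # should approach [2.0, 1.0]

Running the initializer after both minimize() calls matters because AdamOptimizer creates extra variables per weight that would otherwise stay uninitialized when you switch to it.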