
I am trying to implement SGD based on the scaffold that Stanford uses in the first assignment of cs224n. The implementation is in Python. The scaffold is as follows:

def load_saved_params():
    '''A helper function that loads previously saved parameters and resets
    iteration start.'''
    return st, params, state  # st = starting iteration

def save_params(iter, params):
    '''saves the parameters'''
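
The scaffold leaves these two helpers as stubs; below is a minimal sketch of what they could look like, assuming the parameters are stored with numpy and the random state with pickle under hypothetical saved_params_*.npy / saved_state_*.pickle file names (the assignment's actual implementation may differ). The sgd function further down also expects a module-level SAVE_PARAMS_EVERY constant and an import of random, which are defined elsewhere in the assignment file.

import glob
import os.path as op
import pickle
import random

import numpy as np

SAVE_PARAMS_EVERY = 5000  # hypothetical value; the real scaffold defines this elsewhere


def load_saved_params():
    '''Load the most recently saved parameters and random state, if any.'''
    st = 0
    for f in glob.glob("saved_params_*.npy"):  # hypothetical file naming
        num = int(op.splitext(op.basename(f))[0].split("_")[2])
        st = max(st, num)
    if st > 0:
        params = np.load("saved_params_%d.npy" % st)
        with open("saved_state_%d.pickle" % st, "rb") as fh:
            state = pickle.load(fh)
        return st, params, state
    return st, None, None


def save_params(iter, params):
    '''Persist the parameters and the current random state for this iteration.'''
    np.save("saved_params_%d.npy" % iter, params)
    with open("saved_state_%d.pickle" % iter, "wb") as fh:
        pickle.dump(random.getstate(), fh)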

And now the main function (the statements I am interested in are marked with multiple hash symbols):

def sgd(f, x0, step, iterations, postprocessing=None, useSaved=False,
        PRINT_EVERY=10):
    """ Stochastic Gradient Descent

    Implement the stochastic gradient descent method in this function.

    Arguments:
    f -- the function to optimize, it should take a single
         argument and yield two outputs, a cost and the gradient
         with respect to the arguments
    x0 -- the initial point to start SGD from
    step -- the step size for SGD
    iterations -- total iterations to run SGD for
    postprocessing -- postprocessing function for the parameters
                      if necessary. In the case of word2vec we will need to
                      normalize the word vectors to have unit length.
    PRINT_EVERY -- specifies how many iterations to output loss

    Return:
    x -- the parameter value after SGD finishes
    """

    # Anneal learning rate every several iterations
    ANNEAL_EVERY = 20000

    if useSaved:
        start_iter, oldx, state = load_saved_params()
        if start_iter > 0:
            x0 = oldx
            step *= 0.5 ** (start_iter / ANNEAL_EVERY)

        if state:
            random.setstate(state)
    else:
        start_iter = 0

    x = x0

    if not postprocessing:
        postprocessing = lambda x: x

    expcost = None ######################################################

    for iter in xrange(start_iter + 1, iterations + 1):
        # Don't forget to apply the postprocessing after every iteration!
        # You might want to print the progress every few iterations.

        cost = None

        ### END YOUR CODE

        if iter % PRINT_EVERY == 0:
            if not expcost:
                expcost = cost
            else:
                expcost = .95 * expcost + .05 * cost ########################
            print "iter %d: %f" % (iter, expcost)

        if iter % SAVE_PARAMS_EVERY == 0 and useSaved:
            save_params(iter, x)

        if iter % ANNEAL_EVERY == 0:
            step *= 0.5

    return x
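
(The elided body between the loop comments and ### END YOUR CODE is the assignment's exercise; a minimal sketch of the standard SGD update that would typically go there, not the official solution, is:)

        cost, grad = f(x)        # f returns both the cost and the gradient at x
        x -= step * grad         # plain gradient descent step
        x = postprocessing(x)    # e.g. re-normalize word vectors to unit length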

For my purposes I have had no use for expcost. But what is the purpose of expcost in this code? In what situations could it be used? And why is it used to modify the cost computed by the cost function?

Answer


If you notice, expcost is only used for printing the cost. It is just a way of smoothing the cost function, since it can jump around noticeably from batch to batch even though the model is improving in a perfectly sensible way.
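
In other words, expcost is an exponentially weighted moving average of the batch costs, updated as expcost = 0.95 * expcost + 0.05 * cost, and it only affects what gets printed. A standalone sketch (not part of the assignment code) of how the same smoothing tames a noisy per-batch cost:

import random

expcost = None
for i in range(1, 101):
    cost = 1.0 / i + random.uniform(-0.05, 0.05)   # noisy but decreasing cost
    if expcost is None:
        expcost = cost
    else:
        expcost = 0.95 * expcost + 0.05 * cost     # same update as the scaffold
    if i % 10 == 0:
        print("iter %d: raw %.4f, smoothed %.4f" % (i, cost, expcost))

With the 0.95/0.05 weights the running average effectively spans roughly the last 20 costs, so the printed numbers track the trend rather than per-batch noise.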

+1. Thanks. I will accept the answer tomorrow unless another answer points out some other, closer use. – Nitin