2017-05-28

Is it possible to add L2 regularization when using the layers defined in tf.layers?

It seems to me that since tf.layers is a high-level wrapper, there is no easy way to get at the filter weights.

With tf.nn.conv2d it looks like this:

regularizer = tf.contrib.layers.l2_regularizer(scale=0.1) 

weights = tf.get_variable( 
    name="weights", 
    regularizer=regularizer 
) 

# Previous layers 
... 

# Second layer 
layer2 = tf.nn.conv2d( 
    input, 
    weights, 
    strides=[1, 1, 1, 1], 
    padding="SAME") 

# More layers 
... 

# Loss 
loss = ...  # some loss 

reg_variables = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES) 
reg_term = tf.contrib.layers.apply_regularization(regularizer, reg_variables) 
loss += reg_term 
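For reference, the penalty that tf.contrib.layers.l2_regularizer(scale) attaches to a weight tensor is scale * sum(w**2) / 2 (it wraps tf.nn.l2_loss). A minimal NumPy sketch of that arithmetic (the function name here is illustrative, not part of the TensorFlow API):

```python
import numpy as np

def l2_penalty(weights, scale=0.1):
    # Same arithmetic as tf.contrib.layers.l2_regularizer(scale):
    # scale * tf.nn.l2_loss(w), where l2_loss(w) = sum(w**2) / 2
    return scale * 0.5 * np.sum(np.square(weights))

w = np.array([[1.0, 2.0], [3.0, 4.0]])
print(l2_penalty(w))  # 0.1 * 0.5 * (1 + 4 + 9 + 16) = 1.5
```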

What would this look like with tf.layers.conv2d?

Thanks!

Answers


You can pass them to tf.layers.conv2d as arguments:

regularizer = tf.contrib.layers.l2_regularizer(scale=0.1) 
layer2 = tf.layers.conv2d(
    inputs, 
    filters, 
    kernel_size, 
    kernel_regularizer=regularizer) 

Do I need to add the regularizer to the final loss, like 'loss_new = loss_old + regularizer'? – Tom


@TYL Did you add it to the final loss? – thigi


Could you expand your answer? – thigi


Isn't the answer already in your question? You can also use tf.losses.get_regularization_loss (https://www.tensorflow.org/api_docs/python/tf/losses/get_regularization_loss), which will collect all the REGULARIZATION_LOSSES.

... 
layer2 = tf.layers.conv2d( 
    input, 
    filters, 
    kernel_size, 
    kernel_regularizer=tf.contrib.layers.l2_regularizer(scale=0.1)) 
... 
l2_loss = tf.losses.get_regularization_loss() 
loss += l2_loss
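tf.losses.get_regularization_loss simply sums every penalty tensor in the REGULARIZATION_LOSSES collection, one per regularized kernel, and you add that sum to the task loss. The arithmetic, sketched in NumPy under the assumption of two regularized kernels (all names and values here are illustrative):

```python
import numpy as np

scale = 0.1

# Hypothetical kernels of two conv layers created with kernel_regularizer set
kernels = [np.ones((3, 3)), 2.0 * np.ones((2, 2))]

# One penalty per kernel, as l2_regularizer would produce: scale * sum(w**2) / 2
reg_losses = [scale * 0.5 * np.sum(np.square(k)) for k in kernels]

# get_regularization_loss sums the whole collection
l2_loss = sum(reg_losses)  # 0.45 + 0.8 = 1.25

data_loss = 1.0  # placeholder for the task loss
total_loss = data_loss + l2_loss
print(total_loss)  # 2.25
```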