3

Any idea how to implement Torch's spatial reflection padding in Caffe? In Torch, the layers look like this:

(x): nn.SpatialReflectionPadding(l=1, r=1, t=1, b=1) 
    (x): nn.SpatialConvolution(64 -> 64, 3x3) 
    (x): nn.ReLU 
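For reference, reflection padding mirrors the interior values across each border without repeating the edge pixel itself. A quick NumPy sketch of what `SpatialReflectionPadding(l=1, r=1, t=1, b=1)` produces on a small blob (NumPy's `np.pad` with `mode='reflect'` has the same semantics):

```python
import numpy as np

# A 1x1x3x3 input blob in Caffe's (N, C, H, W) layout
x = np.arange(9, dtype=np.float32).reshape(1, 1, 3, 3)

# Reflection padding with l=1, r=1, t=1, b=1: 'reflect' mode mirrors
# across the border without repeating the edge value.
y = np.pad(x, ((0, 0), (0, 0), (1, 1), (1, 1)), mode='reflect')

print(y.shape)  # (1, 1, 5, 5)
print(y[0, 0])
```

For the input rows [0,1,2], [3,4,5], [6,7,8], the first padded row is [4, 3, 4, 5, 4]: row 1 mirrored above row 0, then columns mirrored the same way.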

Answer

2

One way to do this is to use Caffe's Python layer. You can then implement the functionality yourself and customize it to your needs. However, this layer runs only on the CPU, so it may slow down your model, especially if it is used in the middle of the network.

Below, I have defined a layer using the Python layer mechanism that zero-pads the input; you can modify it to suit your needs:

import caffe 
import numpy as np 

class SpatialReflectionPadding(caffe.Layer): 

    def setup(self, bottom, top): 
        if len(bottom) != 1: # check that a single bottom blob is given 
            raise Exception("Expected a single blob")  
        if len(bottom[0].shape) != 4: # check that it is 4D 
            raise Exception("Expected 4D blob") 
        params = eval(self.param_str) # get the params given in the prototxt 
        self.l = params["l"] 
        self.r = params["r"] 
        self.t = params["t"] 
        self.b = params["b"] 

    def reshape(self, bottom, top): 
        # set the shape of the top blob: bottom shape plus padding on each side 
        top[0].reshape(bottom[0].shape[0], 
                       bottom[0].shape[1], 
                       bottom[0].shape[2] + self.t + self.b, 
                       bottom[0].shape[3] + self.r + self.l) 

    def forward(self, bottom, top): 
        for i in range(top[0].shape[2]): 
            for j in range(top[0].shape[3]): 
                if (i < self.t or i >= self.t + bottom[0].shape[2]) or \
                   (j < self.l or j >= self.l + bottom[0].shape[3]): 
                    top[0].data[:, :, i, j] = 0 # for the padded part, set the value to 0 
                else: 
                    # for the rest, copy the value from the bottom blob 
                    top[0].data[:, :, i, j] = bottom[0].data[:, :, i - self.t, j - self.l] 

    def backward(self, top, propagate_down, bottom): 
        # gradient for the backward pass: the top gradient cropped to the interior 
        bottom[0].diff[...] = top[0].diff[:, :, self.t:self.t + bottom[0].shape[2], 
                                          self.l:self.l + bottom[0].shape[3]] 
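Since the layer above zero-pads rather than reflects, here is a hedged sketch of how its forward pass could be adapted to true reflection padding, using `np.pad` with `mode='reflect'` (the standalone function `reflect_forward` is illustrative only, not part of Caffe; inside the layer you would assign its result to `top[0].data[...]`):

```python
import numpy as np

def reflect_forward(bottom_data, l, r, t, b):
    # bottom_data: a 4D (N, C, H, W) array; returns the reflection-padded blob.
    # 'reflect' mode mirrors interior values across each border
    # without repeating the edge pixel, matching Torch's SpatialReflectionPadding.
    return np.pad(bottom_data, ((0, 0), (0, 0), (t, b), (l, r)), mode='reflect')

x = np.arange(16, dtype=np.float32).reshape(1, 1, 4, 4)
out = reflect_forward(x, 1, 1, 1, 1)
print(out.shape)  # (1, 1, 6, 6)
```

Note that with reflection padding the backward pass would also change: gradients flowing into the mirrored border positions must be added back onto the corresponding interior positions, so the crop-only backward shown above would no longer be exact.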

Then, in your prototxt file, you can use it as:

layer { 
    name: "srp" # some name 
    type: "Python" 
    bottom: "some_layer" # the layer which provides the input blob 
    top: "srp" 
    python_param { 
        module: "caffe_srp" # whatever your module name is 
        layer: "SpatialReflectionPadding" 
        param_str: '{ "l": 1, "b": 1, "t": 1, "r": 1 }' 
    } 
} 
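For the prototxt above to find the layer, Caffe must be built with Python-layer support and the module must be importable. A sketch of the setup, assuming the class is saved in a file named `caffe_srp.py` (the path below is a placeholder):

```shell
# In Makefile.config, before building Caffe, enable Python layers:
#   WITH_PYTHON_LAYER := 1
# Then make sure the directory containing caffe_srp.py is on the Python path:
export PYTHONPATH="$PYTHONPATH:/path/to/dir/containing/caffe_srp"
```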

I am not 100% sure that it works correctly, but when I used it, it seemed to. In any case, it should give you an idea and a starting point for how to proceed. Also, you can refer to this question and its answers.

+1

Using a CPU layer in the middle of GPU layers will practically kill Caffe's performance. I've been there. If you must use it, it is worth spending the time to implement it properly. – Shai