2016-02-13

How to write a network with a memory data layer using pycaffe?

It is possible to write a Caffe prototxt in pycaffe with an HDF5 data layer as follows:

import caffe 
from caffe import layers as L 

def logreg(hdf5, batch_size): 
    n = caffe.NetSpec() 
    n.data, n.label = L.HDF5Data(batch_size = batch_size, source = hdf5, ntop = 2) 
    n.ip1 = L.InnerProduct(n.data, num_output = 2, weight_filler = dict(type='xavier')) 
    n.accuracy = L.Accuracy(n.ip1, n.label) 
    n.loss = L.SoftmaxWithLoss(n.ip1, n.label) 
    return n.to_proto() 

with open('models/logreg_auto_train.prototxt', 'w') as f: 
    f.write(str(logreg('data/train.txt', chunck_size)))  # chunck_size: the batch size, assumed defined elsewhere 

Is it possible to use a similar approach to write a prototxt with a memory data layer?

Answer

Try something like this:

import caffe 
from caffe import layers as L 

def logreg(height, width, channels, batch_size): 
    n = caffe.NetSpec() 
    n.data = L.MemoryData(batch_size = batch_size, height = height, width = width, channels = channels) 
    n.ip1 = L.InnerProduct(n.data, num_output = 2, weight_filler = dict(type='xavier')) 
    return n.to_proto() 

with open('models/logreg_memdata.prototxt', 'w') as f: 
    f.write(str(logreg(128, 128, 3, chunck_size)))  # chunck_size: the batch size, assumed defined elsewhere 
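
One detail worth flagging: in the Caffe versions I am familiar with, the MemoryData layer declares two tops, data and label, so the spec is usually unpacked with ntop = 2, just like the HDF5Data example in the question (the comments below turn on exactly this point). A rough sketch of that variant; the function name logreg_memdata and the batch size of 64 are placeholders, not taken from the original answer:

import caffe 
from caffe import layers as L 

def logreg_memdata(height, width, channels, batch_size): 
    n = caffe.NetSpec() 
    # MemoryData provides both a data blob and a label blob, hence ntop = 2 
    n.data, n.label = L.MemoryData(batch_size = batch_size, channels = channels, 
                                   height = height, width = width, ntop = 2) 
    n.ip1 = L.InnerProduct(n.data, num_output = 2, weight_filler = dict(type='xavier')) 
    return n.to_proto() 

with open('models/logreg_memdata.prototxt', 'w') as f: 
    f.write(str(logreg_memdata(128, 128, 3, 64)))  # 64 is just an example batch size 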

Thanks. I have to keep the 'n.label' declaration so that the weights learned by a previous model can be loaded in a simple way. That shouldn't change anything, should it?

Hmm, I don't see how loading would be affected by 'n.label', but I am surprised it is required. Are you doing more training with this new network?

Is there a simple way to set the weights when the layer structure differs? I am not using it for training, indeed; it is just to set the weights in that simple way.
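
On that last point, a note on how weight transfer without training usually goes in Caffe: copy_from matches parameters by layer name, so a net whose data layer differs still picks up the weights of any layer whose name and shape match (here 'ip1'). A rough sketch, assuming the two-top spec above was written to 'models/logreg_memdata.prototxt', that the model trained with the HDF5 net was saved as 'models/logreg_auto.caffemodel' (a made-up file name), and a batch size of 64:

import numpy as np 
import caffe 

# Build the memory-data net and pull in weights from the previously trained model; 
# parameters are copied by matching layer names, so the different data layer is simply skipped. 
net = caffe.Net('models/logreg_memdata.prototxt', caffe.TEST) 
net.copy_from('models/logreg_auto.caffemodel') 

# The memory data layer is filled from Python: data must be float32, C-contiguous 
# and shaped (N, channels, height, width), with N a multiple of the layer's batch_size; 
# labels carry one float32 value per example, given here as (N, 1, 1, 1) to be explicit. 
data = np.random.rand(64, 3, 128, 128).astype(np.float32) 
labels = np.zeros((64, 1, 1, 1), dtype=np.float32) 
net.set_input_arrays(data, labels) 
out = net.forward() 

After the forward pass, out['ip1'] holds the scores for the batch (out['label'] just echoes the labels, since nothing consumes that blob in this spec).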
