
I have built a deep Bayesian neural network with PyMC3, trained the model, and obtained the samples I need. Now I am looking for a way to save this fitted model to disk. I tried pickle, but when I change the size of the test dataset I get an error. This is how I save the model:

def save_model(trace, network, ann_input, num):
    print("in")
    with open('my_model.pkl', 'wb') as buff:
        pickle.dump({'model': network, 'trace': trace}, buff)

    f = open('ann_input' + str(num) + '.pckl', 'wb')
    pickle.dump(ann_input, f)
    f.close()

def load_model(num):
    with open('my_model.pkl', 'rb') as buff:
        data = pickle.load(buff)

    network, trace = data['model'], data['trace']

    f = open('ann_input' + str(num) + '.pckl', 'rb')
    ann_input = pickle.load(f)
    f.close()

    return trace, network, ann_input

I get this error when computing the accuracy:

print(accuracy_score(y_pred, y_test))

File "D:\Users\wissam\AppData\Local\Programs\Python\Python36\lib\site-packages\sklearn\metrics\classification.py", line 172, in accuracy_score
    y_type, y_true, y_pred = _check_targets(y_true, y_pred)
File "D:\Users\wissam\AppData\Local\Programs\Python\Python36\lib\site-packages\sklearn\metrics\classification.py", line 72, in _check_targets
    check_consistent_length(y_true, y_pred)
File "D:\Users\wissam\AppData\Local\Programs\Python\Python36\lib\site-packages\sklearn\utils\validation.py", line 181, in check_consistent_length
    "samples: %r" % [int(l) for l in lengths])
ValueError: Found input variables with inconsistent numbers of samples: [174, 169]
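(The mismatch between 174 and 169 samples suggests the predictions were drawn while ann_input still held the dataset that was pickled, not the new test set. A minimal sketch of the step that appears to be missing, assuming ann_input is the theano shared variable feeding the network:)

ann_input.set_value(X_test)               # point the shared input at the new test set
ppc = pm.sample_ppc(trace, samples=1000)  # now yields one prediction per test row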

I also tried the following code:

with neural_network:
    step = pm.Metropolis()
    print("start sampling")
    db = pm.backends.Text('test')
    trace = pm.sample(10000, step, trace=db)
    print("end sampling")

from pymc3 import summary
summary(trace, varnames=['p'])

When using this backend, I get the following error:

Traceback (most recent call last):
  File "D:\Users\wissam\AppData\Roaming\Python\Python36\site-packages\pymc3\model.py", line 121, in get_context
    return cls.get_contexts()[-1]
IndexError: list index out of range

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:/Users/wissam/PycharmProjects/git_repo/projetenovap/Training/trainModel.py", line 301, in <module>
    trace = pm.backends.text.load('test')
  File "D:\Users\wissam\AppData\Roaming\Python\Python36\site-packages\pymc3\backends\text.py", line 171, in load
    strace = Text(name, model=model)
  File "D:\Users\wissam\AppData\Roaming\Python\Python36\site-packages\pymc3\backends\text.py", line 44, in __init__
    super(Text, self).__init__(name, model, vars)
  File "D:\Users\wissam\AppData\Roaming\Python\Python36\site-packages\pymc3\backends\base.py", line 31, in __init__
    model = modelcontext(model)
  File "D:\Users\wissam\AppData\Roaming\Python\Python36\site-packages\pymc3\model.py", line 131, in modelcontext
    return Model.get_context()
  File "D:\Users\wissam\AppData\Roaming\Python\Python36\site-packages\pymc3\model.py", line 123, in get_context
    raise TypeError("No context on context stack")
TypeError: No context on context stack
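(For reference, the second traceback arises because pm.backends.text.load falls back to Model.get_context() when no model is given. A minimal sketch of loading the Text trace, assuming neural_network is the PyMC3 model the trace was sampled from:)

# Load the Text-backend trace inside the model context ...
with neural_network:
    trace = pm.backends.text.load('test')

# ... or pass the model explicitly
trace = pm.backends.text.load('test', model=neural_network)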

Does anyone have an idea how to save this model?


Well, my problem is solved: you should save only the trace (the sampled data) and create a new neural network each time, saving only the weights rather than the whole network. – user2856587

Answer


Well, my problem is solved: you should save only the trace (the sampled data) and create a new neural network each time, saving only the weights rather than the whole network.

This is the code I used:

import pickle

import numpy as np
import sklearn.preprocessing
import theano
import theano.tensor as T
import pymc3 as pm


def predict(trace, test_X):
    # Rebuild the model with the same structure used for training
    X_test, X_train, y_test, y_train = loadDataset()
    binary = sklearn.preprocessing.LabelBinarizer().fit(y_train)
    y_2_bin = binary.transform(y_train)
    ann_input = theano.shared(X_train)
    n_hidden = 8
    nbHidden = 3

    # Initialize random weights between each layer
    init_1 = np.random.randn(X_train.shape[1], n_hidden)
    init_2 = []
    for i in range(nbHidden):
        init_2.append(np.random.randn(n_hidden, n_hidden))
    init_out = np.random.randn(n_hidden, 3)

    with pm.Model() as neural_network:
        # Weights from input to hidden layer
        weights_in_1 = pm.Normal('w_in_1', 0, sd=1,
                                 shape=(X_train.shape[1], n_hidden),
                                 testval=init_1)

        # Weights between the hidden layers
        weights_1_2 = []
        for i in range(1, nbHidden, 1):
            weights_1_2.append(pm.Normal('w_' + str(i) + '_' + str(i + 1), 0, sd=1,
                                         shape=(n_hidden, n_hidden),
                                         testval=init_2[i]))

        # Weights from the last hidden layer to the output
        weights_2_out = pm.Normal('w_' + str(nbHidden) + '_out', 0, sd=1,
                                  shape=(n_hidden, 3),
                                  testval=init_out)

        # Build the neural network using the tanh activation function
        act_1 = T.tanh(T.dot(ann_input, weights_in_1))
        act_2 = [T.tanh(T.dot(act_1, weights_1_2[0]))]
        for i in range(1, nbHidden, 1):
            act_2.append(T.tanh(T.dot(act_2[i - 1], weights_1_2[i - 1])))

        act_out = T.nnet.softmax(T.dot(act_2[nbHidden - 1], weights_2_out))

        # 3 discrete output classes -> deterministic class probabilities
        p = pm.Deterministic('p', act_out)
        out = pm.Bernoulli('out', p, observed=y_2_bin)
        print("model established")

    # Point the shared input at the test data before predicting
    ann_input.set_value(test_X)

    # Use the saved trace, which contains the sampled weights
    with neural_network:
        print("start sampling")
        ppc = pm.sample_ppc(trace, samples=1000)
        print("end sampling")

    # Return the prediction
    y_pred = ppc['p']
    return y_pred
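ppc['p'] holds one probability vector per posterior draw and per test row, so it still has to be collapsed into hard labels before scoring. A minimal sketch, assuming three classes indexed 0-2 and that X_test and y_test come from the same loadDataset split:

from sklearn.metrics import accuracy_score

y_pred = predict(trace, X_test)        # shape: (1000, n_test, 3)
mean_probs = y_pred.mean(axis=0)       # average over posterior draws
y_labels = mean_probs.argmax(axis=1)   # most probable class per test row
print(accuracy_score(y_test, y_labels))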

To save the trace, I used this function:

# Save the trained model
def save_model(trace, network):
    with open('my_model.pkl', 'wb') as buff:
        pickle.dump({'model': network, 'trace': trace}, buff)

To reload it, I used:

# Reload the trained model
def load_model(num):
    with open('my_model.pkl', 'rb') as buff:
        data = pickle.load(buff)

    network, trace = data['model'], data['trace']
    return network, trace
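Putting the pieces together, a hypothetical round trip (the num argument of load_model is kept from the earlier version even though this variant no longer uses it):

# Hypothetical usage: persist the trace after training,
# then rebuild the network and predict later
save_model(trace, neural_network)
network, trace = load_model(0)
y_pred = predict(trace, X_test)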