
I saved a trained model based on a recurrent neural network. When I run the function 'lstm_vector_predict()' below, it returns different values every time, even though it loads the same model. Does TensorFlow use some random number generation when predicting values? Why do I get different results each time I use my TensorFlow model?

import get_list_of_values_to_input 
import tensorflow as tf 
import tensorflow.contrib.learn as tflearn 
import tensorflow.contrib.layers as tflayers 
from tensorflow.contrib.learn.python.learn import learn_runner 
import tensorflow.contrib.metrics as metrics 
import tensorflow.contrib.rnn as rnn 
import numpy as np 


from backend.common.numpy_array_to_numpy_array_of_arrays import get_numpy_arrays_from_numpy_matrix 

def lstm_vector_predict(model_name='sample_model_vector.meta', number_of_tickers=2, batch_size=20, number_of_points=100, start_time=1489462200): 
    tf.reset_default_graph() 
    inputs = number_of_tickers 
    hidden = 100 
    output = number_of_tickers 
    current_time = start_time 

    X = tf.placeholder(tf.float32, [None, batch_size, inputs]) 
    # Low-level TensorFlow graph construction used to rebuild the network that produces the generated output 
    basic_cell = tf.contrib.rnn.BasicRNNCell(num_units=hidden, activation=tf.nn.relu) 
    rnn_output, states = tf.nn.dynamic_rnn(basic_cell, X, dtype=tf.float32) 
    stacked_rnn_output = tf.reshape(rnn_output, [-1, hidden]) 
    stacked_outputs = tf.layers.dense(stacked_rnn_output, output) 
    outputs = tf.reshape(stacked_outputs, [-1, batch_size, output]) 
    # We get the saver ready 
    saver = tf.train.import_meta_graph(model_name) 
    init = tf.global_variables_initializer() 

    # Later, launch the model, use the saver to restore variables from disk, and 
    # do some work with the model. 
    return_values = [] 
    with tf.Session() as sess: 
        # Restore variables from disk. 
        saver.restore(sess, tf.train.latest_checkpoint('./')) 
        print("Model restored.") 
        # Check the values of the variables 
        sess.run(init) 
        for i in range(number_of_points): 
            last_values = get_list_of_values_to_input() 
            print("Generating point", i) 
            #x_generators = last_values[-batch_size:] 
            x_generators = last_values[-batch_size:].reshape(-1, batch_size, number_of_tickers) 
            y_forecast = sess.run(outputs, feed_dict={X: x_generators}) 
            return_values.append(y_forecast[-1][-1]) 
            current_time += 300 
    return return_values 
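
A quick way to narrow this down (a diagnostic sketch added here, not part of the original code) is to read one restored weight right after saver.restore() and again after sess.run(init); if the two values differ, the initializer is overwriting the checkpoint:

import tensorflow as tf

def inspect_restore_vs_init(meta_path='sample_model_vector.meta', checkpoint_dir='./'):
    tf.reset_default_graph()
    saver = tf.train.import_meta_graph(meta_path)
    probe = tf.trainable_variables()[0]  # any variable works for the comparison
    with tf.Session() as sess:
        # Same order of operations as in lstm_vector_predict(): restore, then init
        saver.restore(sess, tf.train.latest_checkpoint(checkpoint_dir))
        restored = sess.run(probe)
        sess.run(tf.global_variables_initializer())
        reinitialized = sess.run(probe)
        print("max abs difference after init:", abs(restored - reinitialized).max())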

Assuming your checkpoint file has not changed and 'get_list_of_values_to_input' has not changed, another possibility is that the model you are loading ('sample_model_vector.meta') contains some random operations. I.e., tf.Variable() uses a random initializer by default.
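
To illustrate the commenter's point (a minimal sketch, unrelated to the checkpoint in the question): a variable built from a random op gets a fresh draw every time the initializer runs, so re-initializing produces different numbers.

import tensorflow as tf

tf.reset_default_graph()
w = tf.Variable(tf.random_normal([3]))  # random initialization, as with most default initializers
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(w))  # one random draw
    sess.run(tf.global_variables_initializer())
    print(sess.run(w))  # a different random draw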


Do not run the init op after loading the variables. It will overwrite their restored values.
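
In the posted function that would look roughly like this (a sketch reusing the names defined in lstm_vector_predict(); only the ordering changes):

with tf.Session() as sess:
    # Restore variables from disk and do NOT call sess.run(init) afterwards,
    # otherwise every variable is re-drawn from its random initializer.
    saver.restore(sess, tf.train.latest_checkpoint('./'))
    for i in range(number_of_points):
        last_values = get_list_of_values_to_input()
        x_generators = last_values[-batch_size:].reshape(-1, batch_size, number_of_tickers)
        y_forecast = sess.run(outputs, feed_dict={X: x_generators})
        return_values.append(y_forecast[-1][-1])
        current_time += 300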

Answer


You are seeing different results because of the stochastic nature of the LSTM model; it is hard to pin down the random seed for an LSTM model and get 100% reproducible results.
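
If some randomness really is baked into the graph, the usual mitigation (a sketch; on a GPU the results may still not be bit-exact) is to fix the graph-level and NumPy seeds before building the model:

import numpy as np
import tensorflow as tf

np.random.seed(42)      # seeds NumPy-based data preparation
tf.reset_default_graph()
tf.set_random_seed(42)  # graph-level seed for TensorFlow random ops
# ... build the RNN graph and run the session as in the question ...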
