I am new to TensorFlow; I went through the tutorials and ran them successfully. Now I have to solve a problem where the output should not be categorical like the MNIST labels. I want to count the objects in an image, so I need a single numerical output value; since the result can range from 0 to 300+, encoding it as a one-hot vector is not practical.
My code is below (mostly copied from the MNIST tutorial). It works fine when I have multiple classes and the labels are one-hot encoded.
I think I have to adjust the cost function:
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(prediction, y))
but I don't know how. Can anyone help? The prediction should return a single value, and y (the ground truth) is also a single value, e.g. [5] for 5 objects.
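For clarity, here is a minimal sketch of what I mean by a plain numeric label instead of a one-hot vector (the placeholder shape and the batch `example_epoch_y` are only illustrative assumptions, not my actual data pipeline):

# Sketch: each label is a single float, the object count for that image.
y = tf.placeholder('float', [None, 1])
# Hypothetical batch of labels: images containing 5, 0 and 212 objects.
example_epoch_y = [[5.0], [0.0], [212.0]]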
### CNN CONFIG

import logging
import tensorflow as tf

n_classes = 1
batch_size = 100

x = tf.placeholder('float', [None, 16384])  # 128*128 = 16384 (MNIST: 28*28 = 784)
y = tf.placeholder('float')

keep_rate = 0.8
keep_prob = tf.placeholder(tf.float32)


def conv2d(x, W):
    return tf.nn.conv2d(x, W, strides=[1, 1, 1, 1], padding='SAME')


def maxpool2d(x):
    # size of window / movement of window
    return tf.nn.max_pool(x, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='SAME')


def convolutional_neural_network(x):
    # two conv + maxpool layers, one fully connected layer, single linear output
    weights = {'W_conv1': tf.Variable(tf.random_normal([5, 5, 1, 32])),
               'W_conv2': tf.Variable(tf.random_normal([5, 5, 32, 64])),
               'W_fc': tf.Variable(tf.random_normal([32 * 32 * 64, 1024])),
               'out': tf.Variable(tf.random_normal([1024, n_classes]))}

    biases = {'b_conv1': tf.Variable(tf.random_normal([32])),
              'b_conv2': tf.Variable(tf.random_normal([64])),
              'b_fc': tf.Variable(tf.random_normal([1024])),
              'out': tf.Variable(tf.random_normal([n_classes]))}

    x = tf.reshape(x, shape=[-1, 128, 128, 1])

    conv1 = tf.nn.relu(conv2d(x, weights['W_conv1']) + biases['b_conv1'])
    conv1 = maxpool2d(conv1)

    conv2 = tf.nn.relu(conv2d(conv1, weights['W_conv2']) + biases['b_conv2'])
    conv2 = maxpool2d(conv2)

    # fc = tf.reshape(conv2, [-1, 7 * 7 * 64])
    fc = tf.reshape(conv2, [-1, 32 * 32 * 64])
    fc = tf.nn.relu(tf.matmul(fc, weights['W_fc']) + biases['b_fc'])
    # fc = tf.nn.dropout(fc, keep_rate)

    output = tf.matmul(fc, weights['out']) + biases['out']
    return output


def train_neural_network(x):
    prediction = convolutional_neural_network(x)
    cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(prediction, y))
    optimizer = tf.train.AdamOptimizer().minimize(cost)

    # saver
    saver = tf.train.Saver()

    hm_epochs = 50
    with tf.Session() as sess:
        sess.run(tf.initialize_all_variables())

        for epoch in range(hm_epochs):
            epoch_loss = 0
            logging.debug('Epoch: ' + str(epoch) + ' started')

            for i in range(int(len(train_database['images']) / batch_size)):
                epoch_x, epoch_y = getNextBatch(train_database, (i + 1) * batch_size)
                _, c = sess.run([optimizer, cost], feed_dict={x: epoch_x, y: epoch_y})
                epoch_loss += c

            print('Epoch', epoch, 'completed out of', hm_epochs, 'loss:', epoch_loss)
OK, thank you. I replaced my cost function and I am now retraining my model. So far it looks good. Thanks :-) – Biba
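For reference, a minimal sketch of the kind of replacement meant here, assuming the count is treated as a regression target with a mean squared error loss (the exact loss choice is an assumption, not stated above):

# Assumption: regression loss instead of softmax cross-entropy.
# prediction and y are the tensors defined in train_neural_network above.
cost = tf.reduce_mean(tf.square(prediction - y))
optimizer = tf.train.AdamOptimizer().minimize(cost)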