In the training example in the Keras documentation, the binary_crossentropy loss is used with Keras (TensorFlow backend):
https://keras.io/getting-started/sequential-model-guide/#training
binary_crossentropy is used, and a sigmoid activation is added to the last layer of the network. But is a sigmoid in the last layer actually needed? Here is what I found in the source code:
def binary_crossentropy(output, target, from_logits=False):
  """Binary crossentropy between an output tensor and a target tensor.

  Arguments:
      output: A tensor.
      target: A tensor with the same shape as `output`.
      from_logits: Whether `output` is expected to be a logits tensor.
          By default, we consider that `output`
          encodes a probability distribution.

  Returns:
      A tensor.
  """
  # Note: nn.softmax_cross_entropy_with_logits
  # expects logits, Keras expects probabilities.
  if not from_logits:
    # transform back to logits
    epsilon = _to_tensor(_EPSILON, output.dtype.base_dtype)
    output = clip_ops.clip_by_value(output, epsilon, 1 - epsilon)
    output = math_ops.log(output / (1 - output))
  return nn.sigmoid_cross_entropy_with_logits(labels=target, logits=output)
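The "transform back to logits" step above is just the inverse of the sigmoid (the logit function), so applying it to a sigmoid output recovers the original pre-activation values, up to the epsilon clipping. A minimal pure-Python sketch of that round trip (the variable names and sample values are my own, not from the Keras source):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def to_logit(p, eps=1e-7):
    # the "transform back to logits" step: clip, then log(p / (1 - p))
    p = min(max(p, eps), 1.0 - eps)
    return math.log(p / (1.0 - p))

logits = [-2.0, -0.5, 0.0, 1.5, 3.0]      # hypothetical raw scores
probs = [sigmoid(z) for z in logits]      # what a sigmoid output layer yields
recovered = [to_logit(p) for p in probs]  # logit(sigmoid(z)) == z

# the round trip sigmoid -> logit is (numerically) the identity
print(all(abs(z - r) < 1e-5 for z, r in zip(logits, recovered)))  # True
```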
So Keras calls TensorFlow's sigmoid_cross_entropy_with_logits, but inside sigmoid_cross_entropy_with_logits, sigmoid(logits) is computed again:
https://www.tensorflow.org/versions/master/api_docs/python/tf/nn/sigmoid_cross_entropy_with_logits
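Note that the sigmoid inside sigmoid_cross_entropy_with_logits is not applied on top of the model's sigmoid: the backend first undoes the model's sigmoid via the logit transform, so the two computations agree. A pure-Python sketch comparing the probability-space cross-entropy with the numerically stable logit-space form that the TensorFlow docs give for sigmoid_cross_entropy_with_logits (function names and sample values are my own):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def bce_from_prob(t, p, eps=1e-7):
    # cross-entropy on probabilities: -[t*log(p) + (1-t)*log(1-p)]
    p = min(max(p, eps), 1.0 - eps)
    return -(t * math.log(p) + (1 - t) * math.log(1.0 - p))

def bce_from_logit(t, z):
    # stable form documented for tf.nn.sigmoid_cross_entropy_with_logits:
    # max(z, 0) - z*t + log(1 + exp(-|z|))
    return max(z, 0.0) - z * t + math.log1p(math.exp(-abs(z)))

targets = [0.0, 1.0, 1.0, 0.0]
raw = [-1.2, 0.3, 2.0, -0.5]          # hypothetical pre-sigmoid scores
probs = [sigmoid(z) for z in raw]     # output of a sigmoid last layer

a = [bce_from_prob(t, p) for t, p in zip(targets, probs)]
b = [bce_from_logit(t, z) for t, z in zip(targets, raw)]
print(all(abs(x - y) < 1e-5 for x, y in zip(a, b)))  # True
```

So the two paths give the same loss; the only cost of the sigmoid-then-logit round trip is a small loss of numerical precision from the clipping.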
So I don't think it makes sense to add a sigmoid at the end, yet all the binary/multi-label classification examples and tutorials for Keras that I found online do add a sigmoid at the end. Also, I don't understand what is meant by
# Note: nn.softmax_cross_entropy_with_logits
# expects logits, Keras expects probabilities.
Why does Keras expect probabilities? Isn't it using the nn.softmax_cross_entropy_with_logits function? Does this make sense?
Thanks.