I'd like to understand how tf.nn.sparse_softmax_cross_entropy_with_logits works. The documentation says: "A common use case is to have logits of shape [batch_size, num_classes] and labels of shape [batch_size]. But higher dimensions are supported."
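To make the expected shapes concrete, here is a small NumPy sketch of what this loss computes per example: the negative log of the softmax probability assigned to the true class. The logits and labels below are made-up illustration values, not from the original post.

```python
import numpy as np

# logits: shape [batch_size, num_classes]; labels: shape [batch_size],
# containing integer class indices (not one-hot vectors).
logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.3]])
labels = np.array([0, 1])

# Log-softmax with the usual max-shift for numerical stability.
shifted = logits - logits.max(axis=1, keepdims=True)
log_softmax = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))

# Per-example loss: -log p(true class), one scalar per batch element.
loss = -log_softmax[np.arange(len(labels)), labels]
print(loss.shape)  # (2,)
```

This is why the labels argument has shape [batch_size]: each entry is a single integer index used to pick out one log-probability per row.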
So I need to convert my one-hot encoded labels into classes represented by unique integers. The one-hot encoding was created with the following code:

from sklearn.preprocessing import OneHotEncoder

enc = OneHotEncoder()
labels = [[1], [2], [3]]
enc.fit(labels)
for x in [1, 2, 3]:
    print(enc.transform([[x]]).toarray())
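One way to go back from one-hot rows to integer class indices is an argmax along the class axis; with sklearn's OneHotEncoder the columns correspond to the sorted unique labels, so the argmax yields indices 0..num_classes-1, which is the format sparse_softmax_cross_entropy_with_logits expects for labels. A minimal sketch (the one_hot array below mirrors the encoder output for labels 1, 2, 3):

```python
import numpy as np

# One-hot rows as produced by the encoder above.
one_hot = np.array([[1., 0., 0.],
                    [0., 1., 0.],
                    [0., 0., 1.]])

# Each row has exactly one 1; argmax recovers its column index,
# i.e. the integer class id in [0, num_classes).
class_ids = np.argmax(one_hot, axis=1)
print(class_ids)  # [0 1 2]
```

If you need the original label values (1, 2, 3) rather than the 0-based indices, you can index back through the encoder's categories, but for feeding sparse_softmax_cross_entropy_with_logits the 0-based indices are what you want.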