I am getting the following error when trying to load a saved model.
KeyError: 'unexpected key "module.encoder.embedding.weight" in state_dict'
Here is the function I use to load the saved model.
import os
import torch

def load_model_states(model, tag):
    """Load previously saved model states."""
    filename = os.path.join(args.save_path, tag)
    with open(filename, 'rb') as f:
        model.load_state_dict(torch.load(f))
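For reference, here is a minimal sketch of the saving side (the actual save helper is not shown in this question; the name save_model_states and the use of torch.save on model.state_dict() are assumptions for illustration):

def save_model_states(model, tag):
    """Hypothetical counterpart to load_model_states: persist model.state_dict() to disk."""
    filename = os.path.join(args.save_path, tag)
    with open(filename, 'wb') as f:
        torch.save(model.state_dict(), f)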
The model is a sequence-to-sequence network; its init function (constructor) is given below.
def __init__(self, dictionary, embedding_index, max_sent_length, args):
    """Constructor of the class."""
    super(Sequence2Sequence, self).__init__()
    self.dictionary = dictionary
    self.embedding_index = embedding_index
    self.config = args
    self.encoder = Encoder(len(self.dictionary), self.config)
    self.decoder = AttentionDecoder(len(self.dictionary), max_sent_length, self.config)
    self.criterion = nn.NLLLoss()  # Negative log-likelihood loss

    # Initializing the weight parameters for the embedding layer in the encoder.
    self.encoder.init_embedding_weights(self.dictionary, self.embedding_index, self.config.emsize)
When I print the model (the sequence-to-sequence network), I get the following.
Sequence2Sequence (
(encoder): Encoder (
(drop): Dropout (p = 0.25)
(embedding): Embedding(43723, 300)
(rnn): LSTM(300, 300, batch_first=True, dropout=0.25)
)
(decoder): AttentionDecoder (
(embedding): Embedding(43723, 300)
(attn): Linear (600 -> 12)
(attn_combine): Linear (600 -> 300)
(drop): Dropout (p = 0.25)
(out): Linear (300 -> 43723)
(rnn): LSTM(300, 300, batch_first=True, dropout=0.25)
)
(criterion): NLLLoss (
)
)
So, module.encoder.embedding is an embedding layer, and module.encoder.embedding.weight is the associated weight matrix. Why, then, does it say unexpected key "module.encoder.embedding.weight" in state_dict?
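A quick way to see the mismatch is to compare the keys of the freshly constructed model with the keys stored in the checkpoint (a rough diagnostic sketch; the loading path and variable names below are placeholders):

# Diagnostic sketch: inspect the keys on both sides of load_state_dict.
checkpoint = torch.load(os.path.join(args.save_path, tag))
print(list(checkpoint.keys())[:3])          # keys look like 'module.encoder.embedding.weight'
print(list(model.state_dict().keys())[:3])  # keys look like 'encoder.embedding.weight'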