Transformers: from pytorch-pretrained-bert to pytorch-transformers, a problem

Created on 18 Jul 2019 · 2 comments · Source: huggingface/transformers

TypeError: forward() got an unexpected keyword argument 'output_all_encoded_layers'
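For context, a minimal sketch of a call that triggers this error, assuming the old pytorch-pretrained-bert keyword is passed to the new pytorch-transformers BertModel (the token ids here are an illustrative example, not from the issue):

import torch
from pytorch_transformers import BertModel

model = BertModel.from_pretrained('bert-base-cased')
input_ids = torch.tensor([[101, 7592, 102]])  # hypothetical example token ids

# output_all_encoded_layers was removed in pytorch-transformers:
outputs = model(input_ids, output_all_encoded_layers=True)  # raises the TypeError above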

wontfix

All 2 comments

With pytorch-transformers, you should now use:

from pytorch_transformers import BertModel

model = BertModel.from_pretrained('bert-base-cased', output_hidden_states=True)
outputs = model(input_ids)  # input_ids: LongTensor of token ids, shape (batch, seq_len)
all_hidden_states = outputs[-1]  # tuple of num_layers + 1 hidden-state tensors

Note that the first element of all_hidden_states (all_hidden_states[0]) is the output of the embedding layer, which is why there are num_layers + 1 elements in all_hidden_states. The sketch below shows how to index into the tuple.
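As an illustration, here is a minimal end-to-end sketch (the input sentence and variable names are ours, not from the issue) that retrieves the embedding output and the final encoder layer from the tuple:

import torch
from pytorch_transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained('bert-base-cased')
model = BertModel.from_pretrained('bert-base-cased', output_hidden_states=True)
model.eval()

input_ids = torch.tensor([tokenizer.encode("Hello, world!")])
with torch.no_grad():
    outputs = model(input_ids)
all_hidden_states = outputs[-1]

embedding_output = all_hidden_states[0]    # output of the embedding layer
last_layer_output = all_hidden_states[-1]  # output of the final encoder layer
assert len(all_hidden_states) == model.config.num_hidden_layers + 1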

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
