TypeError: forward() got an unexpected keyword argument 'output_all_encoded_layers'
Now you should use:

from transformers import BertModel

model = BertModel.from_pretrained('bert-base-cased', output_hidden_states=True)
outputs = model(input_ids)        # tuple: (last_hidden_state, pooler_output, hidden_states)
all_hidden_states = outputs[-1]   # tuple of hidden states, one per layer plus the embeddings
Note that the first element of all_hidden_states (all_hidden_states[0]) is the output of the embedding layer, which is why all_hidden_states contains num_layers + 1 elements.
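For reference, here is a minimal end-to-end sketch, assuming the tuple-style outputs of transformers 2.x (i.e. before return_dict outputs became the default); the example sentence and the final shape check are just for illustration:

import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained('bert-base-cased')
model = BertModel.from_pretrained('bert-base-cased', output_hidden_states=True)
model.eval()

# Encode an example sentence (add_special_tokens adds [CLS] and [SEP])
input_ids = torch.tensor([tokenizer.encode("Hello, world!", add_special_tokens=True)])

with torch.no_grad():
    outputs = model(input_ids)

all_hidden_states = outputs[-1]
# Embedding output plus one tensor per transformer layer
assert len(all_hidden_states) == model.config.num_hidden_layers + 1
# all_hidden_states[0] is the embedding output, all_hidden_states[-1] the last layer
print(all_hidden_states[0].shape, all_hidden_states[-1].shape)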
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.