The official BertForMaskedLM example: https://huggingface.co/transformers/model_doc/bert.html#bertformaskedlm
has a bug. Running outputs = model(input_ids, labels=input_ids) raises: TypeError: forward() got an unexpected keyword argument 'labels'
Model I am using (Bert, XLNet ...):
BERT
Language I am using the model on (English, Chinese ...):
English
The problem arises when using:
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertForMaskedLM.from_pretrained('bert-base-uncased')
input_ids = torch.tensor(tokenizer.encode("Hello, my dog is cute", add_special_tokens=True)).unsqueeze(0)  # Batch size 1
outputs = model(input_ids, labels=input_ids)
Steps to reproduce the behavior:
It should work with labels, shouldn't it?
transformers version: 2.11.0

Hi @guoxuxu, lm_labels was renamed to labels in a recent commit on master. If you are using master, use labels; otherwise use lm_labels.
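If you need a script that runs on both versions, one option is to inspect the model's forward() signature and pick whichever keyword it accepts. This is a sketch, not part of the transformers API; the stand-in forward functions below are hypothetical stubs that mimic the two signatures.

```python
import inspect

def pick_label_kwarg(forward_fn):
    """Return the label keyword that the given forward() accepts.

    transformers renamed `lm_labels` to `labels` on master, so checking
    the signature lets the same call work against either version.
    """
    params = inspect.signature(forward_fn).parameters
    return "labels" if "labels" in params else "lm_labels"

# Hypothetical stubs mimicking the two API versions, for illustration only:
def forward_master(input_ids, labels=None):       # post-rename signature
    return labels

def forward_v2110(input_ids, lm_labels=None):     # v2.11.0 release signature
    return lm_labels
```

With a real model you would then call model(input_ids, **{pick_label_kwarg(model.forward): input_ids}).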
@sgugger This change is causing a lot of confusion. Would it be a good idea to keep the master and release docs separate?
This has just been done. The documentation now shows the latest stable release (v2.11.0) and you have to opt-in to see the master documentation.
I'll work on a version selector next.
Thanks @sgugger !
I think we can close the issue as a result. Please reopen if any problem persists.