torch: 1.6.0+cu101
Transformers: 3.0.2
Error with "return_dict=True"
from transformers import BertTokenizer, BertForPreTraining
import torch
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertForPreTraining.from_pretrained('bert-base-uncased', return_dict=True)
TypeError Traceback (most recent call last)
<ipython-input-3-5eca8cb45c88> in <module>()
2 import torch
3 tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
----> 4 model = BertForPreTraining.from_pretrained('bert-base-uncased', return_dict=True)
/usr/local/lib/python3.6/dist-packages/transformers/modeling_utils.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
670
671 # Instantiate model.
--> 672 model = cls(config, *model_args, **model_kwargs)
673
674 if state_dict is None and not from_tf:
TypeError: __init__() got an unexpected keyword argument 'return_dict'
Hi! I believe that parameter is only available on master right now, so you should install transformers from the master branch to use it (pip install git+https://github.com/huggingface/transformers). It'll be available in version 3.1.0, which will be released in a couple of days.
Working for me after the upgrade to 3.1.0 - thanks @LysandreJik
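For anyone landing here later, a minimal sketch of how the flag behaves once you are on transformers >= 3.1.0 (the field names below follow the BertForPreTrainingOutput class in that release; the sample sentence is just an illustration):

from transformers import BertTokenizer, BertForPreTraining

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertForPreTraining.from_pretrained('bert-base-uncased', return_dict=True)

# Encode a sample sentence as PyTorch tensors
inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
outputs = model(**inputs)

# With return_dict=True the forward pass returns a named output object
# instead of a plain tuple, so fields can be accessed by attribute.
print(outputs.prediction_logits.shape)        # (batch, seq_len, vocab_size)
print(outputs.seq_relationship_logits.shape)  # (batch, 2)

Attribute access is the main payoff over positional tuple unpacking: the code stays readable even when optional outputs such as hidden_states or attentions are enabled.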