Transformers: Error while loading pretrained model with "return_dict=True"

Created on 22 Aug 2020 · 2 comments · Source: huggingface/transformers

โ“ Questions & Help

torch: 1.6.0+cu101
Transformers: 3.0.2

Error with "return_dict=True"

from transformers import BertTokenizer, BertForPreTraining
import torch
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertForPreTraining.from_pretrained('bert-base-uncased', return_dict=True)

Running this raises:

TypeError                                 Traceback (most recent call last)
<ipython-input-3-5eca8cb45c88> in <module>()
      2 import torch
      3 tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
----> 4 model = BertForPreTraining.from_pretrained('bert-base-uncased', return_dict=True)

/usr/local/lib/python3.6/dist-packages/transformers/modeling_utils.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    670 
    671         # Instantiate model.
--> 672         model = cls(config, *model_args, **model_kwargs)
    673 
    674         if state_dict is None and not from_tf:

TypeError: __init__() got an unexpected keyword argument 'return_dict'

All 2 comments

Hi! I believe that parameter is only available on master right now, so you should install transformers from the master branch to use it (pip install git+https://github.com/huggingface/transformers). It'll be available in version 3.1.0, which will be released in a couple of days.
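For context, once on a version that supports it, `return_dict=True` makes the model return a named output object instead of a plain tuple, so fields can be read by attribute. A minimal sketch of that difference (assumption: this mimics, but is not, transformers' actual `ModelOutput` class):

```python
from collections import namedtuple

# Stand-in for transformers' BertForPreTrainingOutput; the real class
# also supports dict-style and index access.
BertPreTrainingOutput = namedtuple(
    "BertPreTrainingOutput", ["prediction_logits", "seq_relationship_logits"]
)

# return_dict=False style: a plain tuple, accessed by position.
tuple_output = ("logits", "seq_rel")

# return_dict=True style: the same values, accessed by name.
dict_output = BertPreTrainingOutput(*tuple_output)

assert dict_output.prediction_logits == tuple_output[0]
assert dict_output.seq_relationship_logits == tuple_output[1]
```

Attribute access is why the flag is worth upgrading for: code no longer depends on the position of each tensor in the output tuple.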

Working for me after the upgrade to 3.1.0 - thanks @LysandreJik
