Model I am using: BertModel and AutoModel
Language I am using the model on: English
Steps to reproduce the behavior:
from transformers.modeling_auto import AutoModel
from transformers.modeling_bert import BertModel
bert_model = BertModel.from_pretrained('bert-base-uncased', torchscript=True)
bert_model = AutoModel.from_pretrained('bert-base-uncased', torchscript=True)
The AutoModel.from_pretrained call raises a
TypeError: __init__() got an unexpected keyword argument 'torchscript'
Expected behavior: successfully create a BertModel object through the AutoModel class, just as the direct BertModel.from_pretrained call does.
transformers version: 2.9.0
Platform: Darwin Kernel Version 19.4.0: Wed Mar 4 22:28:40 PST 2020; root:xnu-6153.101.6~15/RELEASE_X86_64
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
This issue still exists and is relevant to me. Can we get an official response?
Yes, I would also like an official answer, please.
It appears that AutoConfig accepts a torchscript keyword parameter. The resulting config object can then be passed to AutoModel.from_pretrained as the config keyword argument. Hope this workaround helps, @jonsnowseven.
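For anyone else hitting this on 2.9.0, a minimal sketch of that workaround might look like the following (untested here; it assumes AutoConfig forwards extra keyword arguments such as torchscript onto the loaded config):
from transformers import AutoConfig, AutoModel
# Build the config first, setting torchscript through the keyword argument
config = AutoConfig.from_pretrained('bert-base-uncased', torchscript=True)
# Pass the pre-built config to AutoModel instead of passing torchscript directly
bert_model = AutoModel.from_pretrained('bert-base-uncased', config=config)
print(config.torchscript)  # should print True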
Hi! This was fixed by https://github.com/huggingface/transformers/pull/5665. Could you install from source and try again?