Hi all! I'm getting this error when trying to run the example code:
Can't load tokenizer for 'facebook/rag-token-base/question_encoder_tokenizer'. Make sure that:
'facebook/rag-token-base/question_encoder_tokenizer' is a correct model identifier listed on 'https://huggingface.co/models'
or 'facebook/rag-token-base/question_encoder_tokenizer' is the correct path to a directory containing relevant tokenizer files
Was fixed on master, could you try from master?
cc @lhoestq @patrickvonplaten
Thanks @julien-c ! It worked using master. But I had this other issue:
Using custom data configuration dummy.psgs_w100.nq.no_index
Reusing dataset wiki_dpr (/Users/rcoutin/.cache/huggingface/datasets/wiki_dpr/dummy.psgs_w100.nq.no_index-dummy=True,with_index=False/0.0.0/14b973bf2a456087ff69c0fd34526684eed22e48e0dfce4338f9a22b965ce7c2)
Using custom data configuration dummy.psgs_w100.nq.exact
Reusing dataset wiki_dpr (/Users/rcoutin/.cache/huggingface/datasets/wiki_dpr/dummy.psgs_w100.nq.exact-80150455dfcf97d4/0.0.0/14b973bf2a456087ff69c0fd34526684eed22e48e0dfce4338f9a22b965ce7c2)
Traceback (most recent call last):
File "/Users/rcoutin/git/examples/backup/rag.py", line 5, in <module>
model = RagTokenForGeneration.from_pretrained("facebook/rag-token-nq", retriever=retriever)
File "/Users/rcoutin/git/transformers/src/transformers/modeling_utils.py", line 947, in from_pretrained
model = cls(config, *model_args, **model_kwargs)
File "/Users/rcoutin/git/transformers/src/transformers/models/rag/modeling_rag.py", line 1009, in __init__
self.rag = RagModel(config=config, question_encoder=question_encoder, generator=generator, retriever=retriever)
File "/Users/rcoutin/git/transformers/src/transformers/models/rag/modeling_rag.py", line 487, in __init__
question_encoder = AutoModel.from_config(config.question_encoder)
File "/Users/rcoutin/git/transformers/src/transformers/models/auto/modeling_auto.py", line 615, in from_config
return MODEL_MAPPING[type(config)](config)
File "/Users/rcoutin/git/transformers/src/transformers/models/dpr/modeling_dpr.py", line 514, in __init__
self.question_encoder = DPREncoder(config)
File "/Users/rcoutin/git/transformers/src/transformers/models/dpr/modeling_dpr.py", line 155, in __init__
self.bert_model = BertModel(config)
File "/Users/rcoutin/git/transformers/src/transformers/models/bert/modeling_bert.py", line 764, in __init__
self.embeddings = BertEmbeddings(config)
File "/Users/rcoutin/git/transformers/src/transformers/models/bert/modeling_bert.py", line 181, in __init__
self.position_embedding_type = config.position_embedding_type
AttributeError: 'DPRConfig' object has no attribute 'position_embedding_type'
Not sure about this one, sorry :/ Calling the RAG gurus!
Thanks, man. I'll try to debug my environment a little more. Thanks!
Looks like the issue comes from the changes in #8276.
cc @patrickvonplaten @LysandreJik @zhiheng-huang
Thanks a lot for spotting the bug @racoutinho and pinpointing it @lhoestq. The PR should fix it.
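For context, a minimal standalone sketch of the failure mode and the usual defensive fix (this is an illustration with stand-in classes, not the actual PR or the real transformers code): `BertEmbeddings` reads `config.position_embedding_type` directly, but `DPRConfig` was created before that field existed, so the attribute access raises `AttributeError`. Reading it with a default restores the old behavior for older configs.

```python
class DPRLikeConfig:
    """Stand-in for a config object created before the
    `position_embedding_type` field was introduced."""
    hidden_size = 768  # unrelated attribute, present as usual


def read_position_embedding_type(config):
    # Direct access (config.position_embedding_type) raises AttributeError
    # on configs that predate the field; getattr with a default falls back
    # to the standard "absolute" position embeddings instead.
    return getattr(config, "position_embedding_type", "absolute")


print(read_position_embedding_type(DPRLikeConfig()))  # -> absolute
```

Newer configs that do define the attribute are unaffected, since `getattr` returns the stored value when it exists.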
I love how well maintained this repo is ❤️
Just ran into this issue yesterday, and was very surprised to see it fixed just 1 day later 👍
Thank you, guys!!!! You are rock stars!!!!