Transformers: Bug using RoBERTa models in the QA Transformers pipeline.

Created on 20 May 2020 · 13 comments · Source: huggingface/transformers

🐛 Bug

Hello, I can't use any RoBERTa model with pipeline('question-answering'). Can someone help me figure out how to fix this issue?

Note: this error appears only when I use RoBERTa models.

ERROR:

[screenshot of the error; a textual traceback is posted below]

Labels: Modeling, Pipeline, wontfix

Most helpful comment

It's hard for me to test from an image; can you paste the code as text? Also, if you already have transformers==2.7.0 installed, your !pip install transformers==2.10.0 won't upgrade it. You need to add the --upgrade or -U flag.

Can you add

from transformers import __version__
print(__version__)

just to make sure?

All 13 comments

Hi, could you please post a code sample and a textual error, rather than an image? Thanks.

@LysandreJik yes, my code is very simple; I'm just trying to run a transformers example.

if __name__ == '__main__':
    from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline

    tokenizer = AutoTokenizer.from_pretrained("deepset/roberta-base-squad2")
    model = AutoModelForQuestionAnswering.from_pretrained("deepset/roberta-base-squad2")
    nlp_qa = pipeline('question-answering', model=model, tokenizer=tokenizer, device=0)

    # context must be the document text itself, not a file path
    with open("document.txt", encoding="utf-8") as f:
        st = f.read()

    X = nlp_qa(context=st, question='What is this project?')
    print(X)

And running with this model (or any other) I got this error:

  File "c:/Users/tioga/Desktop/Tranformers/transformers_test.py", line 44, in <module>
      X = nlp_qa(context=st, question='What is this project?')
  File "C:\Python\lib\site-packages\transformers\pipelines.py", line 1042, in __call__
     for s, e, score in zip(starts, ends, scores)
  File "C:\Python\lib\site-packages\transformers\pipelines.py", line 1042, in <listcomp>
     for s, e, score in zip(starts, ends, scores)
 KeyError: 0
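
For reference, the snippet above can also be written in the shorter form used later in this thread, passing the model name straight to pipeline(), which then loads the model and tokenizer itself. A minimal sketch, with a made-up context string:

from transformers import pipeline

# pipeline() also accepts model/tokenizer identifiers directly;
# device=-1 runs on CPU
nlp_qa = pipeline('question-answering',
                  model="deepset/roberta-base-squad2",
                  tokenizer="deepset/roberta-base-squad2",
                  device=-1)

answer = nlp_qa(context="This project is a small question answering demo.",
                question="What is this project?")
print(answer)  # a dict with 'score', 'start', 'end', 'answer' keys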

I can reproduce this error, but it is working with other models for me. Pinging @tholor who might know what is going on.

Hi guys, has anyone managed to work out what the above issue is? I am facing the same problem.
Thanks.

I believe this was fixed in #4049, which is available in the latest release v2.10.0. What are your installed transformers versions?

@LysandreJik I was using 2.7.0, but I still get the same error using 2.10.0

Using the exact code sample mentioned above? Are you using different code?

I have the exact same issue with one of my RoBERTa models, but I just tried the exact code above:

[screenshot of the notebook output]

It's hard for me to test from an image; can you paste the code as text? Also, if you already have transformers==2.7.0 installed, your !pip install transformers==2.10.0 won't upgrade it. You need to add the --upgrade or -U flag.

Can you add

from transformers import __version__
print(__version__)

just to make sure?
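
For instance, in a notebook the upgrade plus check would look like this (a sketch; restart the kernel after the install so the new version is actually imported):

!pip install -U transformers==2.10.0

# after restarting the kernel:
from transformers import __version__
print(__version__)  # should now print 2.10.0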

@LysandreJik works for me. Thank you.

Glad I could help!

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

Hi,

When I use transformers 2.4.0 it works without the error, but with 3.0.2 I get the same error!

So when the context contains the answer, everything is fine; when it does not, I get the same error.
Example:

from transformers.pipelines import pipeline

name = "ktrapeznikov/albert-xlarge-v2-squad-v2"
nlp = pipeline('question-answering', model=name, tokenizer=name, device=-1)

This example won't cause any errors and I get the right answer:

qa_input =  {'question': 'Is the company listed on any stock exchange?', 'context': 'Roche Corporate Executive Committee on 31 December 2019.  We are dedicated to long-term success. Roche is listed on New York stock exchange.'}
qa_response = nlp(qa_input)

This will cause the error:

qa_input =  {'question': 'Is the company listed on any stock exchange?', 'context': 'Roche Corporate Executive Committee on 31 December 2019.  We are dedicated to long-term success.'}
qa_response = nlp(qa_input)

Can you verify that it is not working with 3.0.2?
Do you have any solutions, or should I just use older versions for now?

Thanks!
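
A possible interim workaround, sketched here on the assumption that the KeyError fires whenever no answer span survives decoding: catch it and treat it as "no answer", and on releases whose question-answering pipeline supports the handle_impossible_answer flag, enable that so the pipeline can return an empty answer itself.

from transformers.pipelines import pipeline

name = "ktrapeznikov/albert-xlarge-v2-squad-v2"
nlp = pipeline('question-answering', model=name, tokenizer=name, device=-1)

qa_input = {'question': 'Is the company listed on any stock exchange?',
            'context': 'Roche Corporate Executive Committee on 31 December 2019.'}

try:
    # handle_impossible_answer is supported in recent releases; check yours
    qa_response = nlp(qa_input, handle_impossible_answer=True)
except KeyError:
    # fall back to an explicit "no answer" result
    qa_response = {'answer': '', 'score': 0.0, 'start': 0, 'end': 0}

print(qa_response)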
