Transformers: sentencepiece==0.1.92 causing segmentation fault

Created on 9 Jun 2020 · 12 comments · Source: huggingface/transformers

🐛 Bug

Information

transformers==2.9.1
torch==1.4.0

Starting today, I noticed that the newly released sentencepiece==0.1.92 causes a segmentation fault when calling torch functions.
Downgrading to sentencepiece==0.1.91 solves it.
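A minimal sketch of a guard against the broken releases, assuming sentencepiece may be installed in the environment (the bad-version list comes from this thread; the `check_sentencepiece` helper name is hypothetical):

```python
# Guard against sentencepiece releases known (per this thread) to segfault
# alongside torch 1.4.0. Run this check before importing torch.
from importlib.metadata import PackageNotFoundError, version

BAD_VERSIONS = {"0.1.92", "0.1.94"}  # versions reported broken in this issue


def is_bad_sentencepiece(v: str) -> bool:
    """Return True if the given sentencepiece version is a known-bad release."""
    return v in BAD_VERSIONS


def check_sentencepiece():
    """Raise if the installed sentencepiece is a known-bad release."""
    try:
        v = version("sentencepiece")
    except PackageNotFoundError:
        return None  # not installed, nothing to check
    if is_bad_sentencepiece(v):
        raise RuntimeError(
            f"sentencepiece {v} is known to segfault with torch 1.4.0; "
            "downgrade with: pip install sentencepiece==0.1.91"
        )
    return v
```

Alternatively, just pin the dependency directly: `pip install "sentencepiece==0.1.91"`.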

dependencies

Most helpful comment

Cf #8199 we will remove the hard dependency on sentencepiece (replaced by the tokenizers library) in a coming release, probably end of next week.

All 12 comments

@boy2000-007man Hi, folks. Just curious how you found this bug? It cost me almost the whole day... Anyway, thank you so much!

OMG!!! Awesome Advice!!!!

I spent a whole night addressing the dependency problems and almost lost my mind. This answer saved my life. Much appreciated!

Thanks for this! Also curious how you worked this out - I've spent a whole day trying to figure this out!

Thanks so much you saved my day.

I was dreading the thought of having to dive into this issue with faulthandler, meticulously cross-referencing dependencies against a working version... but this post just saved my night. Thanks @boy2000-007man
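For anyone who does need to go the faulthandler route the commenter mentions, a minimal sketch using the standard-library module (no project-specific APIs assumed):

```python
# Enable faulthandler so a segfault in native code (e.g. inside
# sentencepiece or torch) prints a Python traceback instead of the
# process dying silently.
import faulthandler

faulthandler.enable()  # installs handlers for SIGSEGV, SIGFPE, SIGABRT, SIGBUS

# ...then import torch / run the code that crashes; on a segfault the
# interpreter dumps the current Python stack to stderr before exiting.
```

The same effect is available without code changes via `python -X faulthandler script.py` or the `PYTHONFAULTHANDLER=1` environment variable.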

This seems like a new pytorch v1.4.0 incompatibility issue with the latest huggingface releases. I'm assuming this may have been missed due to the focus on v1.5.0 support, but it seems like many people cannot make the jump to cuda 10.2/pytorch 1.5.0 currently, so this seems like a pretty big headache that should be addressed.

Closing this as solved by #5418

You are excellent!

Same problem when using sentencepiece==0.1.94

Having the same problem with sentencepiece==0.1.94

Cf #8199 we will remove the hard dependency on sentencepiece (replaced by the tokenizers library) in a coming release, probably end of next week.

Thank you a lot! You saved my day!
