Transformers: Can't load config for [community model]

Created on 3 Aug 2020 · 12 comments · Source: huggingface/transformers

Although I can use my fine-tuned GPT2 model from code, the model page complains about the config file (which is already uploaded).
At https://huggingface.co/akhooli/gpt2-small-arabic-poetry, when I enter a prompt, I get:

Can't load config for 'akhooli/gpt2-small-arabic-poetry'. Make sure that:
- 'akhooli/gpt2-small-arabic-poetry' is a correct model identifier listed on 'https://huggingface.co/models'
- or 'akhooli/gpt2-small-arabic-poetry' is the correct path to a directory containing a config.json file
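For context, the error above comes from the config lookup, so a quick way to check whether the identifier resolves from code is to fetch only the config. A minimal sketch, assuming the `transformers` library is installed and the Hugging Face Hub is reachable:

```python
from transformers import AutoConfig

# Fetching only config.json is a lightweight check that the model
# identifier resolves; the inference widget's error suggests this
# lookup was failing on the server side, not in the repo itself.
config = AutoConfig.from_pretrained("akhooli/gpt2-small-arabic-poetry")
print(config.model_type)
```

If this prints `gpt2` while the model page still shows the error, the config file is fine and the problem is on the inference side.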

All 12 comments

Currently working on a fix; will update here.

Should be fixed now

Pinging @mfuntowicz on this.

I am seeing the same issue with a new model I just uploaded (akhooli/xlm-r-large-arabic-sent). It works if called in code through HF pipeline.

You are probably already aware of this, but inference worked for a bit and then broke again ("Can't load config for [model]").

I had to revert my change because it was breaking one of our workflows on the api-inference side. I'm working on a stable patch by today; sorry for the inconvenience.

Should be fixed now. Let us know 👍

Yup, it works.

Confirmed (on an existing model that I then updated). Thanks for the fix and for HF's great work!

It works now. Thanks! @mfuntowicz
