Transformers: When will Pegasus be exportable to ONNX format?

Created on 1 Nov 2020 · 5 comments · Source: huggingface/transformers

It seems it's not available yet; I got this error:
Error while converting the model: Unrecognized configuration class <class 'transformers.configuration_pegasus.PegasusConfig'> for this kind of AutoModel: AutoModel. Model type should be one of RetriBertConfig, T5Config, DistilBertConfig, AlbertConfig, CamembertConfig, XLMRobertaConfig, BartConfig, LongformerConfig, RobertaConfig, LayoutLMConfig, SqueezeBertConfig, BertConfig, OpenAIGPTConfig, GPT2Config, MobileBertConfig, TransfoXLConfig, XLNetConfig, FlaubertConfig, FSMTConfig, XLMConfig, CTRLConfig, ElectraConfig, ReformerConfig, FunnelConfig, LxmertConfig, BertGenerationConfig, DebertaConfig, DPRConfig, XLMProphetNetConfig, ProphetNetConfig.
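The error comes from the conversion path loading the checkpoint with the generic AutoModel class, whose config mapping (as listed above) did not include PegasusConfig at the time; Pegasus checkpoints are registered under the seq2seq auto class instead. A minimal sketch, assuming the google/pegasus-xsum checkpoint as an example (any Pegasus model name works):

```python
from transformers import AutoModel, AutoModelForSeq2SeqLM

# AutoModel.from_pretrained("google/pegasus-xsum")  # raises the error quoted above

# Pegasus is mapped to the seq2seq auto class, so this loads fine:
model = AutoModelForSeq2SeqLM.from_pretrained("google/pegasus-xsum")
```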

Which is fair, since Pegasus is a new addition. Is ONNX support something the team plans to add soon?

Or can someone point me to resources on other ways to export a pre-trained model from Hugging Face? I'm pretty new to machine learning :p

Thanks all!

All 5 comments

@patil-suraj has a partial solution that he just posted to the forums. He might be able to extend that to Pegasus/BART.

I'm on it! Will ping here once I get it working.

@phosfuldev, you can refer to this post to see how T5 is exported to ONNX:
https://discuss.huggingface.co/t/speeding-up-t5-inference/1841
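A minimal sketch of how the approach in that post might carry over to Pegasus: trace the encoder with torch.onnx.export, returning plain tensors rather than the model's dict-like outputs. The google/pegasus-xsum checkpoint, the output file name, the opset version, and the EncoderWrapper helper are illustrative assumptions, not part of any library API:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "google/pegasus-xsum"  # assumed example checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name).eval()


class EncoderWrapper(torch.nn.Module):
    """Return the encoder's last hidden state as a plain tensor so that
    torch.onnx.export can trace it cleanly."""

    def __init__(self, encoder):
        super().__init__()
        self.encoder = encoder

    def forward(self, input_ids, attention_mask):
        return self.encoder(input_ids=input_ids, attention_mask=attention_mask)[0]


inputs = tokenizer("Example text used only to trace the graph.", return_tensors="pt")

torch.onnx.export(
    EncoderWrapper(model.model.encoder),
    (inputs["input_ids"], inputs["attention_mask"]),
    "pegasus_encoder.onnx",  # illustrative output path
    input_names=["input_ids", "attention_mask"],
    output_names=["encoder_hidden_states"],
    dynamic_axes={
        "input_ids": {0: "batch", 1: "sequence"},
        "attention_mask": {0: "batch", 1: "sequence"},
        "encoder_hidden_states": {0: "batch", 1: "sequence"},
    },
    opset_version=12,
)
```

The decoder (together with the LM head) would need the same kind of wrapper and a separate export, with the generation loop then running outside ONNX one step at a time, as the linked post describes for T5.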

@sshleifer @patil-suraj Thank you!!

Thank you so much @patil-suraj for taking the initiative to export Pegasus to ONNX. Eagerly waiting for it :)

Hi @patil-suraj

Please let us know if you have any update on exporting Pegasus to ONNX format.

Apologies for bothering you.

Thanks,
Karthik
