Transformers: Longformer - Compression

Created on 6 Jul 2020 · 4 comments · Source: huggingface/transformers

Hello, I have trained a Longformer model on my custom question-answering dataset. Is there a way to compress this trained model?

Thank you!

wontfix

All 4 comments

You could check out the example script under examples/distillation in the transformers repository.
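
For context, here is a minimal sketch of the soft-target objective that knowledge-distillation scripts of this kind typically minimize; it illustrates the general idea only, not the exact loss in examples/distillation:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    # Soften both distributions with a temperature, then push the
    # student's distribution toward the teacher's.
    s = F.log_softmax(student_logits / temperature, dim=-1)
    t = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 so gradient magnitudes stay comparable to T=1.
    return F.kl_div(s, t, reduction="batchmean") * temperature ** 2
```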

Did you upload the model to the model hub? It would be great so that we can take a closer look :-)

Hello, thanks for the reply. I checked the train.py script under examples/distillation, but it seems that it does not cater to Longformer models (the models mentioned within the script are BERT, RoBERTa, and GPT-2). Yes, I have uploaded the model: https://s3.amazonaws.com/models.huggingface.co/bert/Nomi97/Chatbot_QA.
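
Since the distillation script does not cover Longformer, one architecture-agnostic alternative is PyTorch dynamic quantization, which shrinks the linear-layer weights to int8. A minimal sketch, assuming the linked checkpoint loads as a Longformer question-answering model (the hub id Nomi97/Chatbot_QA is inferred from the URL above):

```python
import torch
from transformers import LongformerForQuestionAnswering

# Load the fine-tuned QA model from the hub (id assumed from the thread).
model = LongformerForQuestionAnswering.from_pretrained("Nomi97/Chatbot_QA")
model.eval()

# Replace Linear layers with dynamically quantized int8 versions;
# weights take roughly 4x less space, inference runs on CPU.
quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

torch.save(quantized.state_dict(), "chatbot_qa_quantized.pt")
```

Dynamic quantization trades a small amount of accuracy for size and CPU speed, so it is worth re-evaluating the quantized model on a held-out QA set before deploying it.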

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
