Hello, I have trained a Longformer model on my custom question-answering dataset. Is there a way to compress this trained model?
Thank you!
You could check out the example script under examples/distillation in the transformers repository.
Did you upload the model to the model hub? It would be great so that we can take a closer look :-)
Hello, thanks for the reply. I checked the train.py script under examples/distillation, but it does not seem to support Longformer models (the models mentioned in the script are BERT, RoBERTa, and GPT-2). Yes, I have uploaded the model: https://s3.amazonaws.com/models.huggingface.co/bert/Nomi97/Chatbot_QA.
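For what it's worth, the core idea of that script (training a smaller student to match the teacher's softened output distributions) can be reproduced for a Longformer QA model directly in PyTorch. Below is a minimal sketch, not the example script's method: the half-depth student, the temperature value, the distill_step helper, and the batch format are all illustrative assumptions, and it assumes a recent transformers version whose model outputs expose start_logits and end_logits.

```python
# Minimal knowledge-distillation sketch for a Longformer QA model.
# Assumptions (not from the example script): the fine-tuned teacher is
# loadable from the hub as "Nomi97/Chatbot_QA", and `batch` is a dict of
# tensors produced by the matching tokenizer, e.g.
#   batch = tokenizer(questions, contexts, return_tensors="pt",
#                     padding=True, truncation=True)
import torch
import torch.nn.functional as F
from transformers import LongformerConfig, LongformerForQuestionAnswering

teacher = LongformerForQuestionAnswering.from_pretrained("Nomi97/Chatbot_QA")
teacher.eval()

# Illustrative student: same architecture with half the layers.
student_config = LongformerConfig.from_pretrained("Nomi97/Chatbot_QA")
student_config.num_hidden_layers //= 2
student = LongformerForQuestionAnswering(student_config)

optimizer = torch.optim.AdamW(student.parameters(), lr=3e-5)
temperature = 2.0  # softens both distributions before comparing them


def distill_step(batch):
    """One step: push the student's start/end logits toward the teacher's."""
    with torch.no_grad():
        t_out = teacher(input_ids=batch["input_ids"],
                        attention_mask=batch["attention_mask"])
    s_out = student(input_ids=batch["input_ids"],
                    attention_mask=batch["attention_mask"])

    # KL divergence between softened start- and end-position distributions,
    # scaled by T^2 as in Hinton et al.'s distillation formulation.
    loss = 0.0
    for s_logits, t_logits in ((s_out.start_logits, t_out.start_logits),
                               (s_out.end_logits, t_out.end_logits)):
        loss = loss + temperature ** 2 * F.kl_div(
            F.log_softmax(s_logits / temperature, dim=-1),
            F.softmax(t_logits / temperature, dim=-1),
            reduction="batchmean",
        )
    # In practice you would usually add the supervised QA loss on the
    # gold start/end positions to this distillation term.
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

If full distillation is more than you need, dynamic quantization of the linear layers (torch.quantization.quantize_dynamic) is a cheaper way to shrink the model without any retraining, at some cost in accuracy.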
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.