We just open-sourced FastFormers, our SustaiNLP 2020 systems (described in the paper FastFormers: Highly Efficient Transformer Models for Natural Language Understanding).
Currently we are hosting it in our own repository, but we would like to merge it back into the transformers repository as an example.
our repo - https://github.com/microsoft/fastformers
For the purposes of the shared task, it is implemented solely on the SuperGLUE dataset.
It therefore depends on Alex Wang's (@W4ngatang) SuperGLUE data processing pipeline.
Many parts of the implementation are also based on Alex's.
(https://github.com/W4ngatang/transformers/tree/superglue)
What would be the best way to merge this back?
Hi! This is great, thanks for offering to contribute it! From what I understand, FastFormers contains several scripts that can be applied to transformers models out of the box: training, distillation, pruning, and quantization alongside the ONNX Runtime with fp16 optimizations.
Is that correct? If so, the easiest way would be to add the corresponding scripts to the examples/ directory, probably under examples/fastformers. If there are modifications to the models themselves, we can look together at how to integrate those into the library.
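For context, a minimal sketch of what dynamic (post-training) quantization of a transformers model can look like for faster CPU inference; this is only an illustration of the general technique, not FastFormers' actual pipeline, and the checkpoint name is a placeholder:

```python
# Sketch only: dynamic quantization of a transformers model with
# stock PyTorch. Not FastFormers' exact pipeline; the checkpoint and
# task are illustrative placeholders.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "bert-base-uncased"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

# Quantize the Linear layers' weights to int8; activations stay fp32
# and are quantized dynamically at runtime.
quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

inputs = tokenizer("FastFormers makes inference fast.", return_tensors="pt")
with torch.no_grad():
    logits = quantized(**inputs).logits
print(logits.shape)
```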
Hi, thanks for your interest! From what I understand, your model falls into the category of dynamic acceleration. For papers of this type, I recommend integrating it into examples/, just like PABEE and DeeBERT. I've emailed you an invitation to our Slack channel, if that works for you. cc @LysandreJik
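For anyone unfamiliar with "dynamic acceleration", a toy sketch of the early-exit idea behind PABEE and DeeBERT; this is not their actual code (DeeBERT exits on entropy, PABEE on patience), and all module sizes here are made-up toy values. A simple confidence threshold stands in for the real exit criterion:

```python
# Toy sketch of early exit ("dynamic acceleration"): an internal
# classifier after every layer lets easy inputs leave the network early.
# Hypothetical illustration; not the PABEE/DeeBERT implementation.
import torch
import torch.nn as nn

class EarlyExitEncoder(nn.Module):
    def __init__(self, hidden=128, num_layers=6, num_labels=2, threshold=0.9):
        super().__init__()
        self.layers = nn.ModuleList(nn.Linear(hidden, hidden) for _ in range(num_layers))
        self.exits = nn.ModuleList(nn.Linear(hidden, num_labels) for _ in range(num_layers))
        self.threshold = threshold  # exit once confidence passes this

    def forward(self, x):
        for i, (layer, exit_head) in enumerate(zip(self.layers, self.exits)):
            x = torch.relu(layer(x))
            probs = exit_head(x).softmax(dim=-1)
            if probs.max() >= self.threshold:  # confident enough: stop here
                return probs, i + 1  # prediction and number of layers used
        return probs, len(self.layers)

model = EarlyExitEncoder()
probs, layers_used = model(torch.randn(1, 128))
print(layers_used)
```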
@LysandreJik yes, that is correct. Thanks @JetRunner, let's discuss further on Slack.