Models: 🙏 [Help wanted for Model Garden] MobileBERT (a Compact Task-Agnostic BERT for Resource-Limited Devices)

Created on 19 May 2020 · 4 comments · Source: tensorflow/models

Help wanted: Research paper code and models

If you want to contribute, please leave a comment to express your interest.

Research paper

Paper

MobileBERT: a Compact Task-Agnostic BERT for Resource-Limited Devices

Current implementation

Status

  • In progress

Mentor

  • Hongkun Yu (@saberkun)

Contributors

  • @vufg

Tasks

  • Convert the TensorFlow 1 checkpoints to TensorFlow 2
  • Reproduce results on fine-tuning tasks
  • Convert a pre-trained model to the TensorFlow Lite format (a rough sketch follows this list)
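
As a rough illustration of the TensorFlow Lite task, the snippet below converts an exported MobileBERT SavedModel with the standard `tf.lite.TFLiteConverter` API. The path, optimization setting, and op fallback are illustrative assumptions, not details taken from this issue.

```python
import tensorflow as tf

# Placeholder path to an exported MobileBERT SavedModel (illustrative only).
saved_model_dir = "/tmp/mobilebert_savedmodel"

# Build a TFLite converter from the SavedModel.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)

# Optional: apply default post-training optimizations (e.g. weight quantization).
converter.optimizations = [tf.lite.Optimize.DEFAULT]

# BERT-style models may need the TensorFlow op fallback in addition to the
# builtin TFLite op set, depending on which ops the exported graph contains.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]

tflite_model = converter.convert()
with open("mobilebert.tflite", "wb") as f:
    f.write(tflite_model)
```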

Development environment

  • Development branch: mobilebert

    • Development work should be done on the mobilebert branch.
    • All changes will be merged into the master branch once the implementation is complete.
  • Target

    • Repository: Official NLP directory
    • Directory: https://github.com/tensorflow/models/tree/master/official/nlp/mobilebert

Requirements

License

By contributing, you agree that your contributions will be licensed under the Apache License 2.0.

Labels: paper, implementation, official

All 4 comments

Working on it

https://github.com/huggingface/transformers/pull/4901#issue-432363045 seems to have completed the requirements?

Hi Harsh188,

Thanks for your interest. We have released mobile_bert_encoder.py and will release the converted checkpoint and more soon.
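
For anyone who wants to try the released encoder, a minimal sketch is below. It assumes mobile_bert_encoder.py is importable from official.nlp.modeling.networks in this repository and that the MobileBERTEncoder constructor has defaults for all arguments; check the released file before relying on either assumption.

```python
# A minimal sketch, assuming the released file is importable as
# official.nlp.modeling.networks.mobile_bert_encoder and exposes a
# MobileBERTEncoder class whose constructor arguments all have defaults.
from official.nlp.modeling.networks import mobile_bert_encoder

encoder = mobile_bert_encoder.MobileBERTEncoder()
```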

huggingface/transformers#4901 (comment) seems to have completed the requirements?
