Transformers: [Model] mT5 Cross-Lingual Model

Created on 28 Oct 2020 · 3 comments · Source: huggingface/transformers

🌟 New model addition

Model description

Multilingual T5 (mT5) is a massively multilingual pretrained text-to-text transformer, trained with a recipe similar to T5's.

Weights and code are available.

GitHub repo: mT5 Weights and Code

Paper: mT5: A massively multilingual pre-trained text-to-text transformer
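For reference, once support lands in Transformers, usage would presumably mirror the existing T5 API. The sketch below is only an assumption based on the T5 classes: the `MT5ForConditionalGeneration` class name and the `google/mt5-small` checkpoint id are guesses, not a confirmed API.

```python
# Hypothetical usage sketch, assuming mT5 is exposed through T5-style classes.
from transformers import MT5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("google/mt5-small")   # assumed checkpoint id
model = MT5ForConditionalGeneration.from_pretrained("google/mt5-small")

# mT5 is a text-to-text model, so every task is framed as string-in / string-out.
# Note: the released mT5 checkpoints are pretrained only (no supervised task mixture),
# so they generally need fine-tuning before producing useful task output.
inputs = tokenizer("translate English to German: The house is wonderful.",
                   return_tensors="pt")
outputs = model.generate(**inputs, max_length=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```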

Open source status

  • [x] the model implementation is available: Implementation
  • [x] the model weights are available: checkpoints
  • [x] who are the authors: (@craffel, @adarob)
New model

All 3 comments

Will be a part of #6285

Hey @sumanthd17, any update on this?

@julien-c thanks for your amazing NLP lib.
When do you plan to support mT5?
When will #6285 be released?
Cheers,
Philippe
