Fairseq: BART Pretraining Script

Created on 24 Mar 2020  ·  4 Comments  ·  Source: pytorch/fairseq

❓ Questions and Help

First of all, thanks for sharing the BART model checkpoints and the code to run them.

What is your question?

Could you provide the pretraining script used for the BART models?

I would like to train a BART model on my own language.
(Of course, I am aware of the mBART models that support other languages, but my target task is not MT, so I believe pretraining BART on data in my target language alone might work better.)
Although I could work out the configuration from the paper, it is easy to miss important training details that way.
A pretraining script like the one for RoBERTa (https://github.com/pytorch/fairseq/blob/master/examples/roberta/README.pretraining.md) would be highly beneficial.
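For reference, the main noising step the paper describes (text infilling) can be sketched roughly as below. This is only a minimal illustration of the objective from the paper, not fairseq's actual implementation (which operates on token tensors inside its denoising task); the zero-length insertion spans and sentence permutation mentioned in the paper are omitted for simplicity, and all names here are my own.

```python
import math
import random

def sample_poisson(lam, rng):
    """Sample from Poisson(lam) using Knuth's algorithm."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1

def text_infilling(tokens, mask_ratio=0.3, poisson_lambda=3.5,
                   mask_token="<mask>", seed=0):
    """Mask ~mask_ratio of the tokens in contiguous spans whose lengths are
    drawn from Poisson(poisson_lambda); each masked span is then replaced
    by a single mask token, as in BART's text-infilling objective.
    Simplified sketch only -- not fairseq's implementation."""
    rng = random.Random(seed)
    budget = int(round(len(tokens) * mask_ratio))  # total tokens to mask
    masked = [False] * len(tokens)
    n_masked = 0
    while n_masked < budget:
        # Span length from Poisson(lambda), clipped to the remaining budget.
        span = min(max(1, sample_poisson(poisson_lambda, rng)),
                   budget - n_masked)
        start = rng.randrange(len(tokens))
        for i in range(start, min(start + span, len(tokens))):
            if not masked[i]:
                masked[i] = True
                n_masked += 1
    # Collapse every run of masked positions into a single mask token.
    out, prev_masked = [], False
    for tok, is_masked in zip(tokens, masked):
        if not is_masked:
            out.append(tok)
        elif not prev_masked:
            out.append(mask_token)
        prev_masked = is_masked
    return out
```

The `mask_ratio=0.3` and `poisson_lambda=3.5` defaults follow the BART paper's reported settings (mask 30% of tokens, span lengths ~ Poisson(λ=3.5)); a real pretraining script would apply this on the fly per batch rather than once per document.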

Thanks a lot in advance!

question

All 4 comments

@ngoyal2707 @yinhanliu


Hello,

I am also very interested in training a customized BART. Have you got any updates?

I'm also very interested in a pretraining script. Any updates? @ngoyal2707 @yinhanliu

Hi, are there any updates on the BART pretraining script?
