Transformers: BART for Pre-Training

Created on 26 Aug 2020 · 8 comments · Source: huggingface/transformers

โ“ Questions & Help

How can I run BART pre-training?
I have data for pre-training (masked LM).

Labels: LM (Pretraining), wontfix

All 8 comments

@sshleifer - I think this is the 3rd issue about BART pre-training -> maybe it would be a good idea to release a small notebook at some point.

@patil-suraj you took a stab at this at some point? this may have been optimistic :(

Yes, I was trying to port the fairseq dataset here, same for T5. I'll try to focus more on it when I'm done with my current PRs. I should start with a notebook as Patrick said, then try to include it in examples/

@patrickvonplaten Does that mean I can train with the masked text as encoder input, the original text as labels, and the corresponding decoder input?

Yes, this should be possible.
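Until an official notebook or example script lands, a minimal sketch of a single denoising step with `BartForConditionalGeneration` could look like the following. The crude single-token `<mask>` replacement is only a stand-in for BART's real text-infilling and sentence-permutation noising, and the `facebook/bart-base` checkpoint is just an illustrative choice.

```python
# Rough sketch of one BART denoising step (not the official pre-training script).
import torch
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

text = "BART is trained by corrupting text and learning to reconstruct it."

# Encoder input: a corrupted version of the text. Here we crudely replace one
# word with <mask>; real BART pre-training uses span infilling and sentence
# permutation as implemented in fairseq.
corrupted = text.replace("corrupting", tokenizer.mask_token)

inputs = tokenizer(corrupted, return_tensors="pt")
labels = tokenizer(text, return_tensors="pt").input_ids

# Passing labels makes the model build decoder_input_ids internally
# (labels shifted right) and return the reconstruction loss.
outputs = model(
    input_ids=inputs.input_ids,
    attention_mask=inputs.attention_mask,
    labels=labels,
)
outputs.loss.backward()
```

In an actual pre-training run you would move the corruption step into a dataset or data collator and drive it with `Trainer` or a manual training loop over your corpus.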

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@patil-suraj any news on the pretraining script for Bart?
