How can I run BART pre-training?
I have data for pre-training (masked LM).
This should help: https://github.com/huggingface/transformers/issues/5096#issuecomment-645860271
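For context, here is a rough, hedged sketch (not the linked script) of how BART-style denoising data could be built with the Hugging Face tokenizer: a contiguous span of words is replaced with a single `<mask>` token, and the original sentence is kept as the reconstruction target. The `infill_noise` helper and the `mask_prob` parameter are illustrative names, not part of the library.

```python
# Sketch: build (corrupted source, original target) pairs for BART denoising.
import random

from transformers import BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")

def infill_noise(text, mask_prob=0.3):
    """Replace one random contiguous span of words with the <mask> token."""
    words = text.split()
    span_len = max(1, int(len(words) * mask_prob))
    start = random.randrange(0, len(words) - span_len + 1)
    noised = words[:start] + [tokenizer.mask_token] + words[start + span_len:]
    return " ".join(noised)

text = "BART is pretrained by corrupting text and learning to reconstruct it."
source = infill_noise(text)   # e.g. "BART is pretrained by <mask> to reconstruct it."
batch = tokenizer(source, return_tensors="pt")            # encoder input
labels = tokenizer(text, return_tensors="pt").input_ids   # reconstruction target
```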
@sshleifer - I think this is the 3rd issue about Bart pre-training -> maybe it would be a good idea to release a small notebook at some point.
@patil-suraj you took a stab at this at some point? this may have been optimistic :(
Yes, I was trying to port the fairseq dataset here, same for T5. I'll try to focus more on it when I'm done with my current PRs. We should start with a notebook as Patrick said, then try to include it in examples/.
@patrickvonplaten Does that mean I can train with the masked input, the original input as labels, and the decoder input?
yes, this should be possible
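As a minimal sketch of that setup (assuming the corrupted/original pair from the earlier example): the corrupted text goes in as `input_ids`, the original text as `labels`. If `decoder_input_ids` are not passed, `BartForConditionalGeneration` builds them internally by shifting the labels to the right, so supplying them explicitly is optional.

```python
# Sketch: one training step of denoising (reconstruction) with BART.
import torch
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

source = "BART is pretrained by <mask> to reconstruct it."  # corrupted input
target = "BART is pretrained by corrupting text and learning to reconstruct it."

inputs = tokenizer(source, return_tensors="pt")
labels = tokenizer(target, return_tensors="pt").input_ids

outputs = model(
    input_ids=inputs.input_ids,
    attention_mask=inputs.attention_mask,
    labels=labels,  # decoder_input_ids are derived from labels if omitted
)
outputs.loss.backward()  # reconstruction loss, ready for an optimizer step
```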
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
@patil-suraj any news on the pretraining script for Bart?