Can we have a seq2seq example with training/fine-tuning and generation for BART/T5 models?
We are hard at work on this! I'd estimate 6 weeks out.
Looking forward to this for the T5 model :)
@sshleifer any updates?
The example doesn't seem to show training/fine-tuning, only evaluation of already fine-tuned models.
@sshleifer Hello, any updates on training/fine-tuning for text generation with the T5 model?
summarization/bart/finetune.py supports T5.