Transformers: BART/T5 seq2seq example

Created on 13 Feb 2020 · 6 comments · Source: huggingface/transformers

🚀 Feature request

Can we have a seq2seq example with training/fine-tuning and generation for BART/T5 models?

seq2seq


All 6 comments

We are hard at work on this! I'd estimate 6 weeks out.

Looking forward to this for the T5 model :)

@sshleifer any updates?

The example doesn't seem to show training/fine-tuning, only evaluation of already fine-tuned models.

@sshleifer Hello, any updates on training/fine-tuning for text generation with the T5 model?

summarization/bart/finetune.py supports T5.

