Transformers: How can I generate new text after having fine-tuned BERT on a custom dataset?

Created on 23 Mar 2019 · 4 comments · Source: huggingface/transformers

Hey,

Once I've fine-tuned the language model, how can I get it to generate new text? Is there any example available?
Thanks!

Labels: Discussion, wontfix

All 4 comments

Also interested in this!

Hi,
It's quite difficult to use BERT to generate text as BERT is not a causal language model per se.
Here is an example: https://github.com/nyu-dl/bert-gen by @W4ngatang and @kyunghyuncho.
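The core idea in that repo is Gibbs-sampling-style generation from BERT's masked-LM head: start from an all-`[MASK]` sequence and repeatedly re-sample one position at a time. Here is a minimal sketch of that idea, assuming the current `transformers` API; the model name, sequence length, and number of sweeps are illustrative placeholders, not values from bert-gen (swap in your own fine-tuned checkpoint path):

```python
import torch
from transformers import BertForMaskedLM, BertTokenizer

# "bert-base-uncased" is a placeholder; point this at your fine-tuned model.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

seq_len = 10
# Start from an all-[MASK] sequence between [CLS] and [SEP].
ids = torch.tensor([[tokenizer.cls_token_id]
                    + [tokenizer.mask_token_id] * seq_len
                    + [tokenizer.sep_token_id]])

with torch.no_grad():
    for _ in range(100):  # number of Gibbs sweeps; arbitrary choice
        pos = torch.randint(1, seq_len + 1, (1,)).item()  # skip [CLS]/[SEP]
        masked = ids.clone()
        masked[0, pos] = tokenizer.mask_token_id          # re-mask one slot
        logits = model(masked).logits                     # (1, seq_len + 2, vocab_size)
        probs = torch.softmax(logits[0, pos], dim=-1)
        ids[0, pos] = torch.multinomial(probs, num_samples=1).item()

print(tokenizer.decode(ids[0], skip_special_tokens=True))
```

Sampling positions in random order rather than left-to-right is what makes this work at all with a bidirectional model; the output quality is well below a proper causal LM, which is the point of the comment above.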

BERT was not trained for text generation, since it is not trained in the classical LM setting. However, there are some newer approaches that don't rely on next-word prediction in the classical LM way. Have a look at the Insertion Transformer and insertion-based decoding.
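To make the decoding order concrete, here is a toy loop (plain Python, with a random stand-in for the model, so vocabulary and choices are purely illustrative) showing how insertion-based decoding differs from left-to-right generation: each step picks a gap anywhere in the partial sequence, where a real Insertion Transformer would score (slot, token) pairs jointly.

```python
import random

vocab = ["the", "cat", "sat", "on", "mat"]  # toy vocabulary
seq = []
for _ in range(5):
    slot = random.randint(0, len(seq))   # any gap, including both ends
    token = random.choice(vocab)         # a real model would pick the best (slot, token)
    seq.insert(slot, token)              # grow the sequence by insertion, not appending
    print(" ".join(seq))
```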

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

