Hey,
Once I've fine-tuned the language model, how can I get it to generate new text? Is there an example available?
Thanks!
Also interested in this!
Hi,
It's quite difficult to use BERT to generate text, as BERT is not a causal language model per se.
Here is an example: https://github.com/nyu-dl/bert-gen by @W4ngatang and @kyunghyuncho.
BERT was not trained for text generation, since it wasn't trained in the classical LM setting. However, there are some newer approaches that don't rely on next-word prediction in the classical LM sense. Have a look at the Insertion Transformer and insertion-based decoding.
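For intuition, here is a minimal sketch of the iterative masked-LM sampling idea behind the bert-gen repo linked above: start from an all-`[MASK]` sequence, repeatedly pick a position, predict a token for it, and substitute. The `predict_token` stub below is an assumption standing in for a real masked-LM forward pass; with the transformers library you would instead mask that position and sample from a `BertForMaskedLM` model's output distribution.

```python
import random

MASK = "[MASK]"

def predict_token(tokens, pos):
    # Stub for BERT's masked-LM head (assumption for this sketch):
    # in practice, mask position `pos`, run BertForMaskedLM on the
    # sequence, and sample from the predicted vocabulary distribution.
    toy_vocab = ["the", "cat", "sat", "on", "a", "mat"]
    return random.choice(toy_vocab)

def gibbs_generate(length, steps=200, seed=0):
    """Generate `length` tokens by iteratively re-predicting masked
    positions (Gibbs-sampling-style, as in bert-gen)."""
    random.seed(seed)
    tokens = [MASK] * length            # start from an all-mask sequence
    for _ in range(steps):
        pos = random.randrange(length)  # pick a position to resample
        tokens[pos] = predict_token(tokens, pos)
    return tokens

print(" ".join(gibbs_generate(6)))
```

This is only the sampling loop; quality depends entirely on the trained model behind `predict_token`, and BERT's outputs were never optimized for left-to-right fluency, which is why the insertion-based approaches above are usually preferred.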
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.