Transformers: Is there pre-train bert or xlnet from scratch code ?

Created on 29 Apr 2020 · 3 Comments · Source: huggingface/transformers

โ“ Questions & Help

Details


A link to original question on Stack Overflow:

LM (Pretraining)

Most helpful comment

@xealml Curious whether you managed to train XLNet? If so, any pointers you could share?

All 3 comments

Yes, you can use the run_language_modeling.py script to pre-train a model (e.g. BERT) from scratch:

https://huggingface.co/transformers/examples.html#language-model-training

(Just leave the model_name_or_path parameter empty for pre-training from scratch)
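A minimal invocation might look like the sketch below. The file paths, tokenizer name, and output directory are placeholders, and the exact flag set may differ across Transformers versions; the key point from the comment above is that `--model_name_or_path` is omitted, so the model weights are randomly initialized rather than loaded from a checkpoint.

```shell
# Hypothetical from-scratch pre-training run with the example script.
# --model_name_or_path is intentionally omitted -> random initialization.
python run_language_modeling.py \
    --model_type bert \
    --tokenizer_name bert-base-uncased \
    --train_data_file train.txt \
    --do_train \
    --mlm \
    --output_dir ./bert-from-scratch
```

A tokenizer still has to come from somewhere: either reuse a pretrained one via `--tokenizer_name`, as shown, or train your own and point the script at it.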

@stefan-it thank you.

@xealml Curious whether you managed to train XLNet? If so, any pointers you could share?

