Transformers: Is there code to pre-train BERT or XLNet from scratch?

Created on 29 Apr 2020 · 3 comments · Source: huggingface/transformers

โ“ Questions & Help

Details

A link to the original question on Stack Overflow:

LM (Pretraining)

All 3 comments

Yes, you can use the run_language_modeling.py script to pre-train e.g. BERT from scratch:

https://huggingface.co/transformers/examples.html#language-model-training

(Just leave the model_name_or_path parameter empty to pre-train from scratch.)
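For anyone landing here later, here is a minimal sketch of the same from-scratch masked-LM pretraining done directly with the Trainer API instead of the script. The train.txt corpus path, the reuse of the bert-base-uncased vocabulary, and the hyperparameters are all placeholder assumptions; LineByLineTextDataset has since been deprecated in newer transformers releases in favor of the datasets library.

```python
from transformers import (
    BertConfig,
    BertForMaskedLM,
    BertTokenizerFast,
    DataCollatorForLanguageModeling,
    LineByLineTextDataset,
    Trainer,
    TrainingArguments,
)

# A true from-scratch run would train its own vocabulary (e.g. with the
# tokenizers library); reusing the bert-base-uncased vocab keeps this short.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")

# Fresh config and randomly initialized weights (no pretrained checkpoint),
# which is what leaving model_name_or_path empty does in the script.
config = BertConfig()  # default vocab_size matches the bert-base-uncased vocab
model = BertForMaskedLM(config)

# train.txt is an assumed plain-text corpus with one example per line.
dataset = LineByLineTextDataset(
    tokenizer=tokenizer, file_path="train.txt", block_size=128
)

# BERT-style masked language modeling: mask 15% of the tokens.
data_collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

training_args = TrainingArguments(
    output_dir="./bert-from-scratch",
    num_train_epochs=1,             # illustrative only
    per_device_train_batch_size=8,  # illustrative only
)

trainer = Trainer(
    model=model,
    args=training_args,
    data_collator=data_collator,
    train_dataset=dataset,
)
trainer.train()
trainer.save_model("./bert-from-scratch")
```

This mirrors what run_language_modeling.py does when --mlm is set and model_name_or_path is left empty; for a genuinely new domain you would first train your own tokenizer rather than reusing an existing vocabulary.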

@stefan-it thank you.

@xealml Curious whether you managed to train XLNet? If so, any pointers you could share?
