Hi,
XLNet is a new pre-training method for NLP that adopts some ideas from the Transformer-XL architecture; it was published today:
XLNet: Generalized Autoregressive Pretraining for Language Understanding.
I just read in the pytorch-pretrained-BERT repo that they're planning to integrate it 😍
So whenever an implementation is available there, I would like to add a new embedding class `XLNetEmbeddings` so that it can be used in flair for downstream tasks.
I am extremely interested in this!
+1 for this
Yes that would be awesome!
Huggingface just released this -> https://github.com/huggingface/pytorch-transformers.
Would like to see the implementation here as well. Hope the development is on track!
Should this be closed since https://github.com/zalandoresearch/flair/pull/941 has been merged?
Yes, good point! :)
Hi @stefan-it and @alanakbik
Could you please advise how to use this for Arabic language embeddings?
I am not sure if there are XLNet embeddings for Arabic, check here: https://huggingface.co/models?search=xlnet
But you can use any Arabic BERT model from this list: https://huggingface.co/models?search=arabic
You can load any model you find here with the TransformerWordEmbeddings or TransformerDocumentEmbeddings classes, depending on whether you want to embed words or documents:
embeddings = TransformerWordEmbeddings(
    model='asafaya/bert-base-arabic',
)
Thank you
Thank you @alanakbik, unfortunately there is no pretrained Arabic XLNet model on Hugging Face. I already tried an Arabic BERT model.