Flair: Pretrained Embeddings for multiple languages.

Created on 16 Mar 2019 · 5 comments · Source: flairNLP/flair

Hello. I have been looking around to figure out how to use pretrained embeddings for multiple languages. What I am trying to achieve is to have pretrained embeddings for multiple languages so that my ML model knows that similar words correspond across languages. For example, "Good" in English is the same as "Gut" in German. Any help in this regard would be highly appreciated.

question

All 5 comments

And the MUSE library contains several pretrained word embeddings for English-X :)

@stefan-it Thank you very much for replying so quickly. I will check the papers and the vecmap library as you pointed out. I have a question about MUSE, though. In the German-English dictionary, for instance, there is an entry "mit with"; does that mean I can use the embedding of "with" for "mit" (or vice versa), so that similar words end up clustered together?
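Roughly, yes: once the two monolingual spaces are aligned, a translation pair like "mit"/"with" should point in nearly the same direction, so cosine similarity between them is high. A toy sketch of that idea (the 3-dimensional vectors below are made up for illustration, not real MUSE embeddings):

```python
import math

# Made-up aligned embeddings in a shared space (NOT real MUSE vectors):
# after alignment, translation pairs should have similar directions.
aligned = {
    ("de", "mit"):   [0.9, 0.1, 0.2],
    ("en", "with"):  [0.88, 0.12, 0.19],
    ("en", "house"): [0.1, 0.95, 0.05],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

sim_pair = cosine(aligned[("de", "mit")], aligned[("en", "with")])
sim_unrelated = cosine(aligned[("de", "mit")], aligned[("en", "house")])
print(sim_pair > sim_unrelated)  # the translation pair is much closer
```

With real MUSE vectors the same comparison works, only over 300-dimensional embeddings loaded from the published `.vec` files.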

I haven't tried it yet, but there's a nice notebook that shows how to get nearest neighbors and even visualize bilingual embeddings.


See here:

https://github.com/facebookresearch/MUSE/blob/master/demo.ipynb

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

