Is it possible to add Google's Transformer Model available at https://tfhub.dev/google/universal-sentence-encoder-large/3 as part of Document Embeddings within Flair?
I couldn't find any PyTorch port of that model, and it would be very handy to be able to run comparisons.
Yes, that is a great idea. We would like to get more transformer architectures into Flair, for instance to train transformer-based LMs.
I am not sure when we can get around to implementing this, so any help from the community would be greatly appreciated!
I think InferSent[1] would be a good alternative and a PyTorch implementation exists :)
[1] [Supervised Learning of Universal Sentence Representations from Natural Language Inference Data](https://arxiv.org/abs/1705.02364)
Here is another good example. Who would like to implement this with me?
Here is also the link for the pre-trained large English model: https://storage.googleapis.com/tfhub-modules/google/universal-sentence-encoder-large/3.tar.gz
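Until someone ports the model, one way to prototype this would be a small adapter that hides any sentence encoder behind an `embed()`-style interface, so it can later be swapped for the real TF Hub module. The sketch below is purely illustrative: the class name, the `embed()` signature, and the `toy_encoder` stand-in are assumptions, not Flair's actual `DocumentEmbeddings` API, and a real version would load the module from the TF Hub URL above and attach the vectors to `flair.data.Sentence` objects.

```python
# Hypothetical adapter sketch -- NOT real Flair or TF Hub code.
# A deterministic hash-based encoder stands in for USE so the
# pattern can be exercised without downloading the model.
import hashlib
from typing import Callable, List

class SentenceEncoderDocumentEmbeddings:
    """Wraps any texts -> vectors function as a document embedder."""

    def __init__(self, encoder: Callable[[List[str]], List[List[float]]]):
        self.encoder = encoder

    def embed(self, documents: List[str]) -> List[List[float]]:
        # A real Flair implementation would set the embedding on each
        # Sentence object instead of returning a list of vectors.
        return self.encoder(documents)

def toy_encoder(texts: List[str], dim: int = 8) -> List[List[float]]:
    # Stand-in for the Universal Sentence Encoder: one fixed-size
    # vector per input text, derived from a SHA-256 digest.
    vectors = []
    for text in texts:
        digest = hashlib.sha256(text.encode("utf-8")).digest()
        vectors.append([b / 255.0 for b in digest[:dim]])
    return vectors

embeddings = SentenceEncoderDocumentEmbeddings(toy_encoder)
vecs = embeddings.embed(["hello world", "flair is nice"])
```

Swapping `toy_encoder` for a function that calls the TF Hub module (or InferSent) would then let the rest of a comparison pipeline stay unchanged.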
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.