Efficient Transformer with locality-sensitive hashing and reversible layers
https://openreview.net/forum?id=rkgNKkHtvB
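For context, the paper's core idea is to replace dense self-attention with attention restricted to hash buckets: queries and keys that are close in angle are hashed to the same bucket via random projections. A minimal PyTorch sketch of the angular-LSH bucketing step described in the paper (the function name, seed handling, and toy shapes are mine for illustration, not the paper's code):

```python
import torch

def lsh_buckets(vectors, n_buckets, seed=0):
    # Angular LSH via random rotations: project each vector onto
    # n_buckets // 2 random directions and take the argmax over the
    # concatenation [xR; -xR], as in the Reformer paper.
    torch.manual_seed(seed)  # same hash must be shared across queries/keys
    d = vectors.shape[-1]
    rotations = torch.randn(d, n_buckets // 2)
    projected = vectors @ rotations                   # (..., n_buckets // 2)
    projected = torch.cat([projected, -projected], dim=-1)
    return projected.argmax(dim=-1)                   # one bucket id per vector

# Toy usage: vectors close in angle tend to share a bucket,
# so attention can then be computed within buckets only.
qk = torch.randn(8, 64)  # 8 shared query/key vectors of dim 64
print(lsh_buckets(qk, n_buckets=4))
```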
I have started refactoring the original source code in PyTorch; if you'd like to help, I'd greatly appreciate it! https://github.com/zbloss/reformer
I have a working implementation at https://github.com/lucidrains/reformer-pytorch !
Any update on adding this to the library?
They published a blog post about it: https://ai.googleblog.com/2020/01/reformer-efficient-transformer.html?m=1
Hence my interest in a huggingface implementation :)
Looking forward to seeing this model in the transformers lib :)
I don't think we should rush this one. The Reformer paper is pretty tricky to implement in a clean way, plus there aren't any pre-trained models that use it yet. Just one person's opinion, though.
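One reason a clean implementation is hard: Reformer's reversible residual layers recompute activations during the backward pass instead of storing them, which doesn't fit the usual `nn.Sequential` pattern. A minimal sketch of the forward/inverse identities (RevNet-style, as used in the paper; the `ReversibleBlock` class and sanity check are illustrative, with F and G standing in for the attention and feed-forward blocks):

```python
import torch
import torch.nn as nn

class ReversibleBlock(nn.Module):
    # Forward:  y1 = x1 + F(x2);  y2 = x2 + G(y1)
    # Inverse:  x2 = y2 - G(y1);  x1 = y1 - F(x2)
    def __init__(self, f: nn.Module, g: nn.Module):
        super().__init__()
        self.f, self.g = f, g

    def forward(self, x1, x2):
        y1 = x1 + self.f(x2)
        y2 = x2 + self.g(y1)
        return y1, y2

    def inverse(self, y1, y2):
        # Recover the inputs exactly, so activations need not be stored.
        x2 = y2 - self.g(y1)
        x1 = y1 - self.f(x2)
        return x1, x2

# Sanity check of reversibility (a real implementation would also wire
# this into a custom autograd.Function to realize the memory savings).
blk = ReversibleBlock(nn.Linear(16, 16), nn.Linear(16, 16))
x1, x2 = torch.randn(2, 16), torch.randn(2, 16)
with torch.no_grad():
    r1, r2 = blk.inverse(*blk.forward(x1, x2))
print(torch.allclose(r1, x1, atol=1e-6), torch.allclose(r2, x2, atol=1e-6))
```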
The implementation by @lucidrains seems to work: https://github.com/lucidrains/reformer-pytorch ; it'd be cool if it were included in the transformers library. It seems strange to me that no pretrained Reformer has been uploaded since the paper was released; any ideas why? Is it possible that it doesn't work in practice as well as the authors state in the paper? Has anyone trained a Reformer on their own and tried it on a real problem?
Thank you very much in advance
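For anyone who wants to experiment while waiting: a minimal usage sketch of @lucidrains' package, based on its README at the time (the `ReformerLM` class and its keyword arguments are that repo's API, not transformers'):

```python
import torch
from reformer_pytorch import ReformerLM  # pip install reformer_pytorch

model = ReformerLM(
    num_tokens=20000,   # vocabulary size
    dim=512,
    depth=6,
    heads=8,
    max_seq_len=8192,
    causal=True         # autoregressive language modeling
)

x = torch.randint(0, 20000, (1, 8192))
logits = model(x)       # shape: (1, 8192, 20000)
```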
Same here, curious to know why. Thank you!
https://github.com/google/trax/blob/master/trax/models/reformer/machine_translation.ipynb
There should be a pretrained model now.
Would be very happy to see Reformer model in this project.
+1
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Closed by @patrickvonplaten
Has this been done?