Google released the paper, code, dataset, and pre-trained models for their new T5, beating the state of the art in 17/24 tasks.
+1, it is very impressive work.
https://github.com/google-research/text-to-text-transfer-transformer
However, I would prefer to see ALBERT implemented before T5.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Please keep this issue open.
It's not especially well documented, but it's clearly present:
https://github.com/huggingface/transformers/blob/dc17f2a1110aed8d1729e77b0619601e3d96b84e/src/transformers/modeling_tf_t5.py
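For anyone who wants to try it, here's a minimal sketch of loading the TF T5 model through the library. It assumes the `t5-small` checkpoint name and the `T5Tokenizer`/`TFT5Model` classes as exported in later releases; the exact call signature in the freshly merged code at the commit above may differ slightly.

```python
from transformers import T5Tokenizer, TFT5Model

# Checkpoint name assumed from the released model sizes (small/base/large/3B/11B).
tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = TFT5Model.from_pretrained("t5-small")

# T5 is an encoder-decoder model, so the forward pass needs decoder
# input ids in addition to the encoder input ids.
input_ids = tokenizer.encode(
    "translate English to German: The house is wonderful.", return_tensors="tf"
)
outputs = model(input_ids, decoder_input_ids=input_ids)
last_hidden_states = outputs[0]  # decoder hidden states: (batch, seq_len, d_model)
print(last_hidden_states.shape)
```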