Facebook AI is introducing M2M-100, the first multilingual machine translation (MMT) model that translates between any pair of 100 languages without relying on English data.
This model is very big. Is there a good way to prune it?
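One common starting point is magnitude pruning with PyTorch's `torch.nn.utils.prune` utilities. The sketch below uses a stand-in `nn.Linear` layer rather than the actual M2M-100 weights (which you would load via fairseq or Hugging Face Transformers), and is only an illustration of the mechanism, not a recipe validated on this model:

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hypothetical stand-in for one projection layer of the model;
# the real M2M-100 checkpoint would be loaded separately.
layer = nn.Linear(1024, 1024)

# Zero out the 30% of weights with the smallest L1 magnitude.
prune.l1_unstructured(layer, name="weight", amount=0.3)

# Bake the mask into the weight tensor and drop the reparametrization.
prune.remove(layer, "weight")

sparsity = (layer.weight == 0).float().mean().item()
print(f"sparsity: {sparsity:.2f}")
```

Note that unstructured sparsity like this mostly saves memory after compression; actual speedups usually need structured pruning (whole heads or layers) or sparse-aware kernels.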
Moving to #8054, which is a duplicate (that I created)!