Is there a max sentence length for this BERT code?
Hi, 512 tokens if you use the pre-trained models (the limit comes from the pre-trained position embeddings). Any length you want if you train your models from scratch.
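For reference, a minimal sketch of checking that limit programmatically; it assumes the current Hugging Face `transformers` API rather than whatever library version this thread originally used:

```python
# Sketch: inspect the position-embedding limit of a pre-trained BERT.
# Assumes the Hugging Face `transformers` package is installed.
from transformers import BertConfig

config = BertConfig.from_pretrained("bert-base-uncased")
# The pre-trained checkpoints ship 512 learned position embeddings,
# which is what caps the input length.
print(config.max_position_embeddings)  # 512
```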
Could we set it smaller? Because if I set it to 512, I run out of memory.
You can just send a smaller input to the model; there's no need to go to the max.
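A sketch of what sending a smaller input could look like, assuming the current `transformers` tokenizer API; the 128-token budget is just an example value. Since self-attention memory grows roughly quadratically with sequence length, shortening the input cuts memory use substantially:

```python
# Sketch: truncate inputs to a smaller length to reduce memory use.
# Assumes the Hugging Face `transformers` API; 128 is an example budget.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

text = "An example sentence that we want to encode with BERT."
# max_length/truncation keep the sequence at 128 tokens instead of 512.
inputs = tokenizer(text, max_length=128, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, 768)
```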
thank you @thomwolf