Could you please provide an example of the XNLI task for XLM-RoBERTa?
The current example (https://github.com/pytorch/fairseq/tree/master/examples/xlmr) is quite simple and only covers single-sentence input, not sentence pairs.
Thanks a lot!
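For anyone landing here before official instructions are out: XLM-R uses the same hub interface as RoBERTa, whose `encode()` accepts a second sentence and joins the pair with separator tokens. A minimal sketch (the sentences are made up, not from XNLI):

```python
import torch

# Load pretrained XLM-R base via torch.hub (downloads the checkpoint on first use).
xlmr = torch.hub.load('pytorch/fairseq', 'xlmr.base')
xlmr.eval()

# XNLI examples are premise/hypothesis pairs; encode() accepts additional
# sentences and inserts separator tokens between them.
premise = 'The cat sat on the mat.'
hypothesis = 'There is a cat on the mat.'
tokens = xlmr.encode(premise, hypothesis)

# Encoder output for the concatenated pair, e.g. as input to a classification head.
features = xlmr.extract_features(tokens)
print(tokens.shape, features.shape)
```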
Hey, we will release XNLI fine-tuning instructions soon.
Thanks! Looking forward to it.
I am using the same input format as BERT. My results are 0.828 for En and 0.732 for Zh, using XLM-R base, 4 epochs, learning rate 2e-5, batch size 16. Could you please share the hyperparameters needed to reproduce the results published in the paper?
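For concreteness, a rough sketch of the kind of loop described above (XLM-R base, 4 epochs, lr 2e-5); the `'xnli'` head name is arbitrary and the two-example dataset is a toy stand-in for the real English MultiNLI training data, which would be iterated in batches of 16:

```python
import torch
from torch.nn.utils.rnn import pad_sequence

xlmr = torch.hub.load('pytorch/fairseq', 'xlmr.base')
# Add a 3-way sentence-pair classification head; the name is our choice.
xlmr.model.register_classification_head('xnli', num_classes=3)
xlmr.train()

# Toy stand-in for MultiNLI: (premise, hypothesis, label) with
# 0/1/2 = entailment/neutral/contradiction.
data = [
    ('A man is playing a guitar.', 'A person is making music.', 0),
    ('A man is playing a guitar.', 'A man is reading a book.', 2),
]

optimizer = torch.optim.Adam(xlmr.model.parameters(), lr=2e-5)
criterion = torch.nn.CrossEntropyLoss()
pad_idx = xlmr.task.source_dictionary.pad()

for epoch in range(4):
    # Encode each pair, then pad to a rectangular batch.
    batch = [xlmr.encode(p, h) for p, h, _ in data]
    tokens = pad_sequence(batch, batch_first=True, padding_value=pad_idx)
    labels = torch.tensor([y for _, _, y in data])
    logits = xlmr.predict('xnli', tokens, return_logits=True)
    loss = criterion(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```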
@kartikayk Can you please share the above details?
@tomking1988 I'm guessing you're talking about the zero-shot setting here. Following is the setup we used for the numbers published in the paper:
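For context, "zero-shot" here means the classification head is fine-tuned on English MNLI only and then applied unchanged to the other XNLI languages. A sketch of that evaluation step (in a real run the `'xnli'` head would come from fine-tuning; here an untrained one is registered just so the snippet executes):

```python
import torch

xlmr = torch.hub.load('pytorch/fairseq', 'xlmr.base')
# Assumption: a fine-tuned checkpoint would already contain this head.
xlmr.model.register_classification_head('xnli', num_classes=3)
xlmr.eval()

# Evaluate directly on a non-English pair: the shared multilingual encoder
# is what enables cross-lingual transfer without translated training data.
tokens = xlmr.encode('他去了商店。', '他买了一些水果。')  # Chinese premise/hypothesis
logprobs = xlmr.predict('xnli', tokens)
print(logprobs.argmax(dim=-1))  # predicted NLI label index
```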
Closing after @kartikayk's answer.