Hi @singletongue,
I am trying to use question answering for Japanese; however, I could not find any model trained for that.
I tried with the available models but the results were way off (as expected...).
Any suggestions on available models, or another library that already handles QA in Japanese?
If it is supposed to work as-is, could you share a simple example?
Thank you in advance!
Hi @Mukei,
As far as I know, there is no Transformer-based model fine-tuned for Japanese question answering tasks.
This is partly due to the scarcity of Japanese QA datasets (like SQuAD) on which to train the models.
(Of course, we do wish to release models for QA; that is left for future work.)
As a workaround, you could load the bert-base-japanese weights into the BertForQuestionAnswering model and just fine-tune the qa_outputs layer (in the case of a single-span prediction task). It trains quickly and may already produce sufficient results.
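Something like the following might work as a starting point. This is only a minimal sketch: it assumes the cl-tohoku/bert-base-japanese checkpoint (with fugashi/ipadic installed for the tokenizer), and the training example and answer-span indices are made up for illustration; a real run needs a proper Japanese QA dataset with character-offset-to-token alignment.

```python
import torch
from transformers import BertJapaneseTokenizer, BertForQuestionAnswering

model_name = "cl-tohoku/bert-base-japanese"  # assumed checkpoint name on the hub
tokenizer = BertJapaneseTokenizer.from_pretrained(model_name)
model = BertForQuestionAnswering.from_pretrained(model_name)  # qa_outputs is randomly initialized

# Freeze the BERT encoder so only the qa_outputs layer is updated.
for param in model.bert.parameters():
    param.requires_grad = False

# One made-up training example; real data needs proper span alignment.
question = "東京タワーはいつ完成しましたか？"
context = "東京タワーは1958年に完成した。"
inputs = tokenizer(question, context, return_tensors="pt")

# Illustrative token indices for the answer span「1958年」(adjust for real data).
start_positions = torch.tensor([15])
end_positions = torch.tensor([16])

optimizer = torch.optim.AdamW(model.qa_outputs.parameters(), lr=5e-5)

model.train()
outputs = model(**inputs, start_positions=start_positions, end_positions=end_positions)
loss = outputs.loss  # on older transformers versions the loss is outputs[0]
loss.backward()
optimizer.step()
```

For actual training you would of course loop over a full dataset and evaluate on held-out questions rather than a single hand-built example.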
@cronoik Thank you for the advice.
I tried, but unfortunately the results were pretty bad even for some simple phrases.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.