Transformers: Convert_tf_checkpoint_to_pytorch for bert-joint-baseline

Created on 4 Apr 2019 · 6 comments · Source: huggingface/transformers

Hello,

I know that the formats of SQuAD and Google NQ are different, but is there a way to convert the BERT joint model for Natural Questions (https://github.com/google-research/language/tree/master/language/question_answering/bert_joint) to PyTorch? I get this error:

`'BertForPreTraining' object has no attribute 'answer_type_output_bias'`

Labels: Discussion, Help wanted, wontfix

All 6 comments

I also encountered a similar problem: `AttributeError: 'BertForPreTraining' object has no attribute 'crf_loss'`

@thomwolf

Looking forward to your reply

Hi, from my reading of the Natural Questions model, it doesn't seem to be possible to load this model directly in the current library with simple hacks.

You will need to define a new sub-class of BertModel (e.g. BertForNaturalQA) that reproduces the architecture of the TensorFlow model I pointed to. If you use the same names as the TensorFlow variables for the attributes of your PyTorch model, you should be able to load the model with the current loading script.

I don't have time to do this right now but if you want to start opening a PR, I can review it.

Basically, just add another class after the BertForQuestionAnswering class in modeling.py

Thanks a lot for the reply. Will try this out.

Best,

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

Hi @raheja, do you have any updates related to this? I'm trying to do the same and would welcome any hint!

Hi @paulachocron
Haven't been able to try or implement this yet. I am still using the tf version for now.
