Transformers: load_tf_weights_in_bert : 'BertModel' object has no attribute 'bias'

Created on 1 Mar 2020 · 2 comments · Source: huggingface/transformers

AttributeError                            Traceback (most recent call last)
<ipython-input-14-0d66155b396d> in <module>
     12 
     13         K.clear_session()
---> 14         model = create_model()
     15         optimizer = tf.keras.optimizers.Adam(learning_rate=2e-5)
     16         model.compile(loss='categorical_crossentropy', optimizer=optimizer, metrics=['acc', 'mae'])

<ipython-input-13-4f429fe61419> in create_model()
      5     config = BertConfig.from_pretrained(BERT_PATH + 'bert_config.json')
      6     config.output_hidden_states = False
----> 7     bert_model = BertModel.from_pretrained(BERT_PATH + 'bert_model.ckpt.index', from_tf=True, config=config)
      8     # if config.output_hidden_states = True, obtain hidden states via bert_model(...)[-1]
      9     embedding = bert_model(input_id, attention_mask=input_mask, token_type_ids=input_atn)[0]

~/anaconda3/envs/fasterai/lib/python3.7/site-packages/transformers/modeling_utils.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    482             if resolved_archive_file.endswith(".index"):
    483                 # Load from a TensorFlow 1.X checkpoint - provided by original authors
--> 484                 model = cls.load_tf_weights(model, config, resolved_archive_file[:-6])  # Remove the '.index'
    485             else:
    486                 # Load from our TensorFlow 2.0 checkpoints

~/anaconda3/envs/fasterai/lib/python3.7/site-packages/transformers/modeling_bert.py in load_tf_weights_in_bert(model, config, tf_checkpoint_path)
    103                 pointer = getattr(pointer, "weight")
    104             elif scope_names[0] == "output_bias" or scope_names[0] == "beta":
--> 105                 pointer = getattr(pointer, "bias")
    106             elif scope_names[0] == "output_weights":
    107                 pointer = getattr(pointer, "weight")

~/anaconda3/envs/fasterai/lib/python3.7/site-packages/torch/nn/modules/module.py in __getattr__(self, name)
    574                 return modules[name]
    575         raise AttributeError("'{}' object has no attribute '{}'".format(
--> 576             type(self).__name__, name))
    577 
    578     def __setattr__(self, name, value):

AttributeError: 'BertModel' object has no attribute 'bias'

Related libraries & versions:
transformers 2.5.1
tensorflow 2.1.0

environment:
NVIDIA-SMI 440.59 Driver Version: 440.59 CUDA Version: 10.2

Modeling


All 2 comments

Change
BertModel.from_pretrained
to
BertForPreTraining.from_pretrained
and it seems to work.

Glad you could get it to work! Indeed, BertForPreTraining should be used to convert from official BERT models.
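The fix above can be sketched as follows. This is a minimal sketch based on the snippet in the traceback: the BERT_PATH file names come from the original code, and load_bert_from_tf_checkpoint is a hypothetical helper name, not part of the transformers API.

```python
def load_bert_from_tf_checkpoint(bert_path):
    # Imports are local so this sketch stays importable without transformers.
    from transformers import BertConfig, BertForPreTraining

    config = BertConfig.from_pretrained(bert_path + "bert_config.json")
    config.output_hidden_states = False

    # BertForPreTraining owns the output_bias / output_weights parameters that
    # load_tf_weights_in_bert tries to assign, so the AttributeError goes away.
    pretraining_model = BertForPreTraining.from_pretrained(
        bert_path + "bert_model.ckpt.index", from_tf=True, config=config
    )

    # The plain encoder, equivalent to the BertModel the original code wanted.
    return pretraining_model.bert
```

The key point is that the official TF 1.x checkpoint contains the pre-training heads (MLM and NSP), so it must be loaded into BertForPreTraining; the base encoder is then available as its .bert submodule.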
