Transformers: TensorFlow BERT model can't return all hidden_states

Created on 25 Jul 2020 · 7 comments · Source: huggingface/transformers

import tensorflow as tf
from tensorflow.keras import layers
from transformers import BertConfig, BertTokenizer, TFBertModel

configuration = BertConfig.from_pretrained('bert-base-cased', output_hidden_states=True)
# Load dataset, tokenizer, model from pretrained model/vocabulary
tokenizer = BertTokenizer.from_pretrained('bert-base-cased', config=configuration)
# BERT encoder
encoder = TFBertModel.from_pretrained('bert-base-cased', config=configuration)
# Model
input_ids = layers.Input(shape=(100,), dtype=tf.int32)
token_type_ids = layers.Input(shape=(100,), dtype=tf.int32)
attention_mask = layers.Input(shape=(100,), dtype=tf.int32)
outputs = encoder(
    input_ids, token_type_ids=token_type_ids, attention_mask=attention_mask
)
print(outputs)

_, _, hidden_states = outputs[0], outputs[1], outputs[2]

output:

If your task is similar to the task the model of the checkpoint was trained on, you can already use TFBertModel for predic
(<tf.Tensor 'tf_bert_model_2/Identity:0' shape=(None, 100, 768) dtype=float32>, <tf.Tensor 'tf_bert_model_2/Identity_1:0' shape=(None, 768) dtype=float32>)


---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
<ipython-input-7-63944e137dcd> in <module>()
     17 )
     18 print(outputs)
---> 19 _, _, hidden_states = outputs[0], outputs[1], outputs[2]

IndexError: tuple index out of range
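The crash itself is plain tuple indexing: in this graph-mode call the encoder returned only a 2-tuple (the per-token sequence output and the pooled output), so `outputs[2]` is out of range. A minimal stand-alone sketch of a defensive access pattern (no TensorFlow required; the strings are hypothetical stand-ins for the two returned tensors):

```python
# The model call returned only two outputs; hidden states were not included.
outputs = ("sequence_output", "pooled_output")  # stand-ins for the two tensors

# Indexing a third element raises IndexError, exactly as in the traceback.
try:
    hidden_states = outputs[2]
except IndexError:
    hidden_states = None  # hidden states were not returned by the model

# Safer: check the tuple length before indexing.
hidden_states = outputs[2] if len(outputs) > 2 else None
print(hidden_states)
```

This does not restore the missing hidden states, but it makes the failure explicit instead of crashing mid-unpack.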

You can check it:
colab code

wontfix


All 7 comments

I'm encountering the exact same error - and it also happens when trying to output the attention keys.
I would really like to know where the problem comes from as well!

@sshleifer

@julien-c

transformers==2.7.0 solved the issue in my case

Thanks a lot. It solved my case too.
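For reference, the workaround reported above is simply pinning the library to the version where this worked (whether that version is appropriate for your project is up to you):

```shell
pip install transformers==2.7.0
```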

Thank you so much @VietHoang1710, this worked for me. I had the same problem with the TensorFlow RoBERTa model and lost a lot of time to this one!

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
