I just installed the library in a TensorFlow environment (2.0.0-rc1) and there is no BertModel in transformers.
Is TFBertModel the equivalent? If so, I get the error TypeError: stat: path should be string, bytes, os.PathLike or integer, not NoneType when loading the model with model = TFBertModel.from_pretrained('neuralmind/bert-base-portuguese-cased').
transformers version: 2.4.1
BertModel is the PyTorch model, and is therefore only available if you have torch installed. As you correctly said, TFBertModel is the TensorFlow equivalent.
Importing with from transformers import TFBertModel raises the above error?
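For reference, a minimal sketch of the two equivalents (assuming both torch and tensorflow >= 2.0 are installed; bert-base-uncased is just an example checkpoint that ships weights for both frameworks):
from transformers import BertModel, TFBertModel
pt_model = BertModel.from_pretrained('bert-base-uncased')    # PyTorch weights, needs torch
tf_model = TFBertModel.from_pretrained('bert-base-uncased')  # TensorFlow weights, needs tensorflow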
Loading the model with TFBertModel.from_pretrained('neuralmind/bert-base-portuguese-cased') is what gives me the error.
This model is only available in PyTorch; Neuralmind has not provided a TensorFlow checkpoint for it. You can see this on the model page, which has the PyTorch tag but no TensorFlow tag.
You can still load it in TensorFlow, but you have to add the from_pt flag:
from transformers import TFBertModel
TFBertModel.from_pretrained('neuralmind/bert-base-portuguese-cased', from_pt=True)
This might require you to have PyTorch installed to do the conversion.
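If the conversion goes through, one option (just a sketch, assuming torch is available for that one-time conversion; the local path is only an example) is to save the converted weights to disk so later loads are pure TensorFlow:
from transformers import TFBertModel
model = TFBertModel.from_pretrained('neuralmind/bert-base-portuguese-cased', from_pt=True)
model.save_pretrained('./bert-portuguese-tf')  # writes config.json + tf_model.h5
model = TFBertModel.from_pretrained('./bert-portuguese-tf')  # later loads no longer need torch or from_pt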
Thank you, but with that I get the error OSError: Loading a TF model from a PyTorch checkpoint is not supported when using a model identifier name..
I did install PyTorch.
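(Side note for anyone stuck on the 2.4.1 pip release, where from_pt with a model identifier raises that OSError: a possible workaround, sketched here assuming torch is installed and with the local folder name as a placeholder, is to materialize the PyTorch checkpoint locally first and run the conversion from that local path rather than from the hub identifier:)
from transformers import BertModel, TFBertModel
BertModel.from_pretrained('neuralmind/bert-base-portuguese-cased').save_pretrained('./bert-portuguese-pt')  # downloads pytorch_model.bin + config.json
tf_model = TFBertModel.from_pretrained('./bert-portuguese-pt', from_pt=True)  # convert from the local PyTorch checkpoint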
Hi, I too have a problem importing the BERT model. Error:
File "chatbot.py", line 54, in models
bert_model = TFBertModel.from_pretrained('bert-base-uncased')
File "C:\Users\CHENG\AppData\Local\Programs\Python\Python37\lib\site-packages\transformers\modeling_tf_utils.py", line 351, in from_pretrained
assert os.path.isfile(resolved_archive_file), "Error retrieving file {}".format(resolved_archive_file)
File "C:\Users\CHENG\AppData\Local\Programs\Python\Python37\lib\genericpath.py", line 30, in isfile
st = os.stat(path)
TypeError: stat: path should be string, bytes, os.PathLike or integer, not NoneType
Sometimes it works, sometimes it throws this error. I don't know why; any help will be appreciated!
@rodrigoruiz, indeed, this functionality was added 12 days ago with https://github.com/huggingface/transformers/commit/961c69776f8a2c95b92407a086848ebca037de5d, so it wouldn't be available on the pip version of 2.4.1. My bad.
Would you try installing from source with pip install git+https://github.com/huggingface/transformers and let me know if it fixes your issue?
@LysandreJik Thank you, that worked!
I have the same problem with TFXLMRobertaModel.from_pretrained("xlm-roberta-base"). Did you solve it?
Hi @Riccorl, my problem somehow just disappeared after restarting and upgrading TensorFlow to 2.1.0. I'm not sure how it was solved. Initially the error popped up randomly, meaning sometimes it worked smoothly and sometimes not, but now I have no errors at all.
Maybe do a pip install -U transformers
And then pip install -U tensorflow-gpu
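A quick sanity check afterwards (just a sketch) to confirm which versions are actually picked up by the interpreter you run:
import tensorflow as tf
import transformers
print(transformers.__version__)  # should show the upgraded release
print(tf.__version__)            # e.g. 2.1.0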
It seems like I have this problem only with the xlm-roberta TensorFlow models. Other models work. Maybe I should open a new issue.
I had the same error with this:
model = TFBertModel.from_pretrained('bert-base-uncased')
File "/home/cally/.local/lib/python3.7/site-packages/transformers/modeling_tf_utils.py", line 403, in from_pretrained
assert os.path.isfile(resolved_archive_file), "Error retrieving file {}".format(resolved_archive_file)
File "/usr/local/lib/python3.7/genericpath.py", line 30, in isfile
st = os.stat(path)
TypeError: stat: path should be string, bytes, os.PathLike or integer, not NoneType
This is my code:
model = TFBertModel.from_pretrained('bert-base-uncased')
Did anyone solve it? Sometimes it works, sometimes the error appears.
Upgrading the packages mentioned above (transformers and tensorflow-gpu) solved this issue for me. It's working fine now. Thanks @nixon-nyx!
I guess this can now be closed
@daraksha-shirin you're welcome! Glad that I could help!
Yep, I guess this can now be closed.
I'm still having the exact same issue when fine-tuning a model with TFAutoModel, with the following package versions:
tensorflow: 2.2.0
transformers: 3.0.2
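For what it's worth, the assertion fires because resolved_archive_file comes back as None, which usually means the download of the model archive failed silently, so the intermittent behaviour is consistent with a flaky connection or a stale cache. A possible workaround, just a sketch (force_download is an existing from_pretrained argument; the checkpoint name and local path below are only examples):
from transformers import TFAutoModel
model = TFAutoModel.from_pretrained('bert-base-uncased', force_download=True)  # ignore the cache and re-download
model.save_pretrained('./bert-base-uncased-tf')                                # keep a local copy once it works
model = TFAutoModel.from_pretrained('./bert-base-uncased-tf')                  # subsequent loads never hit the network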