Model I am using (Bert, XLNet ...): RobertaForMaskedLM
Language I am using the model on (English, Chinese ...): English
The problem arises when using: the official example script, run_language_modeling.py.
The task I am working on is: masked language modeling.
Steps to reproduce the behavior:
Run the example script run_language_modeling.py.
Error message:
Traceback (most recent call last):
  File "run_language_modeling.py", line 42, in <module>
    from transformers import (
ImportError: cannot import name 'MODEL_WITH_LM_HEAD_MAPPING'
Expected behavior: the script should run without the ImportError.
transformers version: 2.5.1
The workaround is to use:
from transformers.modeling_auto import MODEL_WITH_LM_HEAD_MAPPING
from transformers.file_utils import WEIGHTS_NAME
Can you please update the example script? It is confusing ...
You need to upgrade your version of transformers (to 2.6), or better, to install from source.
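For example, either of the following should do it (commands shown for illustration, not taken from the original report):

pip install --upgrade transformers
# or, to install from source:
pip install git+https://github.com/huggingface/transformers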
I just pulled the huggingface/transformers-tensorflow-gpu:2.10.0 docker image, went to the examples/language-modeling/ folder and ran the following, and I got the same error:
python3 run_language_modeling.py --output_dir=/app/data --model_type=distilbert --model_name_or_path=distilbert-base-uncased --do_train --train_data_file=/app/data/train_data.txt --do_eval --eval_data_file=/app/data/eval_data.txt --mlm
Haven't tried the workaround above yet.
Steps:
docker run -it -v `pwd`/data:/app/data huggingface/transformers-tensorflow-gpu:2.10.0
cd workspace/examples/language-modeling/
run the python3 command above

python3 -m pip show transformers reports that 2.10.0 is installed.
I get the issue (the master branch being checked out in the docker build); it just seems like it'd be cool for there to be a simpler way to run the examples in Docker. If you wanted to use the 2.9.0 image, you'd have to pull the image and have your script first check out the repo at the 2.9.0 tag and then install from source, right?
It'd be a nice feature if the docker images could run the examples without modification
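For reference, that workflow would presumably look something like this (a sketch only; the checkout location inside the image and the exact tag name are assumptions):

docker run -it -v `pwd`/data:/app/data huggingface/transformers-tensorflow-gpu:2.9.0
# inside the container:
cd workspace                   # assumed location of the transformers source checkout
git checkout v2.9.0            # pin the checkout to the same version as the image
python3 -m pip install .       # reinstall from source so library and examples match
cd examples/language-modeling/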
I get the same issue when I pip install transformers. When I downgrade to 2.6.0, it can't import CONFIG_MAPPING. With anything from 2.7.0 to 2.10.0 I get the MODEL_WITH_LM_HEAD_MAPPING error.
Okay, I got it to work with 2.10.0. I just had to reinstall PyTorch:
pip3 install torch
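To double-check that the fix took effect, a quick sanity check (not from the original thread) is to try the failing imports directly:

python3 -c "from transformers import MODEL_WITH_LM_HEAD_MAPPING, CONFIG_MAPPING; print('imports OK')"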