Transformers: pip install [--editable] . ---> Error

Created on 1 Mar 2019  ·  21 comments  ·  Source: huggingface/transformers

Hi, I'm getting this error when running "pip install [--editable] ." after cloning the git repository:
Exception:
Traceback (most recent call last):
File "/venv/lib/python3.5/site-packages/pip/_vendor/packaging/requirements.py", line 93, in __init__
req = REQUIREMENT.parseString(requirement_string)
File "/venv/lib/python3.5/site-packages/pip/_vendor/pyparsing.py", line 1814, in parseString
raise exc
File "/venv/lib/python3.5/site-packages/pip/_vendor/pyparsing.py", line 1804, in parseString
loc, tokens = self._parse( instring, 0 )
File "/venv/lib/python3.5/site-packages/pip/_vendor/pyparsing.py", line 1548, in _parseNoCache
loc,tokens = self.parseImpl( instring, preloc, doActions )
File "/venv/lib/python3.5/site-packages/pip/_vendor/pyparsing.py", line 3722, in parseImpl
loc, exprtokens = e._parse( instring, loc, doActions )
File "/venv/lib/python3.5/site-packages/pip/_vendor/pyparsing.py", line 1552, in _parseNoCache
loc,tokens = self.parseImpl( instring, preloc, doActions )
File "/venv/lib/python3.5/site-packages/pip/_vendor/pyparsing.py", line 3502, in parseImpl
raise ParseException(instring, loc, self.errmsg, self)
pip._vendor.pyparsing.ParseException: Expected stringEnd (at char 11), (line:1, col:12)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/venv/lib/python3.5/site-packages/pip/_internal/cli/base_command.py", line 179, in main
status = self.run(options, args)
File "/venv/lib/python3.5/site-packages/pip/_internal/commands/install.py", line 289, in run
self.name, wheel_cache
File "/venv/lib/python3.5/site-packages/pip/_internal/cli/base_command.py", line 269, in populate_requirement_set
wheel_cache=wheel_cache
File "/venv/lib/python3.5/site-packages/pip/_internal/req/constructors.py", line 280, in install_req_from_line
extras = Requirement("placeholder" + extras_as_string.lower()).extras
File "/venv/lib/python3.5/site-packages/pip/_vendor/packaging/requirements.py", line 97, in __init__
requirement_string[e.loc : e.loc + 8], e.msg
pip._vendor.packaging.requirements.InvalidRequirement: Parse error at "'[--edita'": Expected stringEnd

Has anyone seen anything like this? Any ideas?

Most helpful comment

The pip install -e . is probably working; it's just that some tests are failing because the code was not tested on Torch 1.2.0.

The install should have worked fine, and you should be fine using every component in the library with torch 1.2.0 except the decoder architectures, on which we are working now. Updating to torch 1.3.0 means it will work with the decoder architectures too.

All 21 comments

Did you run python install --editable .

There's a way to install cloned repositories with pip, but the easiest way is to use plain python for this:

After cloning and changing into the pytorch-pretrained-BERT directory, run python setup.py develop.

Yes, please follow the installation instructions on the readme here

@thomwolf
I have exactly the same problem after following readme installation (mentioned). I am using pytorch.

python -m pytest -sv ./transformers/tests/ has two failed tests.

transformers/tests/modeling_bert_test.py::BertModelTest::test_bert_model PASSED
transformers/tests/modeling_bert_test.py::BertModelTest::test_bert_model_as_decoder FAILED
transformers/tests/modeling_bert_test.py::BertModelTest::test_config PASSED
transformers/tests/modeling_bert_test.py::BertModelTest::test_determinism PASSED
transformers/tests/modeling_bert_test.py::BertModelTest::test_for_masked_lm PASSED
transformers/tests/modeling_bert_test.py::BertModelTest::test_for_masked_lm_decoder FAILED
transformers/tests/modeling_bert_test.py::BertModelTest::test_for_multiple_choice PASSED

======================================================= 2 failed, 403 passed, 227 skipped, 36 warnings in 49.14s ======================================================

@bheinzerling,
python setup.py develop goes through OK, but the test result is the same as above: two failed tests.

Does anybody know why "pip install [--editable] ." failed here? Is there some missing Python package needed for this?

Please open a command line and enter pip install git+https://github.com/huggingface/transformers.git to install the Transformers library from source. However, Transformers v2.2.0 was released just yesterday, and you can install it from PyPI with pip install transformers

Try installing this latest version, launch the test suite, and keep us updated on the result!

Does anybody know why "pip install [--editable] ." failed here? Is there some missing Python package needed for this?

@TheEdoardo93
This is indeed the latest version installed (installed a few hours ago)

Name: transformers
Version: 2.2.0
Summary: State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch
Home-page: https://github.com/huggingface/transformers
Author: Thomas Wolf, Lysandre Debut, Victor Sanh, Julien Chaumond, Google AI Language Team Authors, Open AI team Authors, Facebook AI Authors, Carnegie Mellon University Authors
Author-email: [email protected]
License: Apache
Location: /home/pcl/venvpytorch/lib/python3.6/site-packages
Requires: sacremoses, numpy, requests, boto3, regex, tqdm, sentencepiece
Required-by:

@TheEdoardo93
After uninstalling and reinstalling with pip install git+https://github.com/huggingface/transformers.git,
the results are still the same as before (two failed).

======================================================= 2 failed, 403 passed, 227 skipped, 36 warnings in 49.13s =======

Name: transformers
Version: 2.2.0
Summary: State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch
Home-page: https://github.com/huggingface/transformers
Author: Thomas Wolf, Lysandre Debut, Victor Sanh, Julien Chaumond, Google AI Language Team Authors, Open AI team Authors, Facebook AI Authors, Carnegie Mellon University Authors
Author-email: [email protected]
License: Apache
Location: /home/pcl/venvpytorch/opensource/transformers
Requires: numpy, boto3, requests, tqdm, regex, sentencepiece, sacremoses
Required-by:

When I executed python -m pytest -sv ./transformers/tests/, I obtained the following result: 595 passed, 37 skipped, 36 warnings in 427.58s (0:07:07).
When I executed python -m pytest -sv ./examples/, I obtained the following result: 15 passed, 7 warnings in 77.09s (0:01:17).


I did not install TensorFlow, which is the reason for the skips. I need the reasons for the failures. I guess I will install TensorFlow and see how it goes.

In the README.md file, Transformers' authors say to install TensorFlow 2.0 and PyTorch 1.0.0+ before installing the Transformers library.

I did not install TensorFlow, which is the reason for the skips. I need the reasons for the failures. I guess I will install TensorFlow and see how it goes.

"First you need to install one of, or both, TensorFlow 2.0 and PyTorch." I don't think that is the reason for failure.

Hi, I believe these two tests fail with an error similar to:

 RuntimeError: expected device cpu and dtype Long but got device cpu and dtype Bool

If I'm not mistaken, you're running torch 1.2 and we're testing with torch 1.3. This is a bug, as we aim to support torch from 1.0.1+. Thank you for raising the issue; you can fix it by installing torch 1.3+ while we work on fixing this.
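The workaround pattern hinted at here (gate behavior on the installed torch version) can be sketched without torch itself; the helper name below is hypothetical, not part of the library:

```python
from packaging import version


def needs_long_mask(torch_version: str) -> bool:
    """Hypothetical guard: on torch < 1.3, bool masks can trigger
    'expected ... Long but got ... Bool' errors, so a cast to long
    would be needed before the affected ops."""
    return version.parse(torch_version) < version.parse("1.3.0")


print(needs_long_mask("1.2.0"))  # True  -> cast masks to long
print(needs_long_mask("1.3.1"))  # False -> bool masks are fine
```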

Thanks! Yeah, I found it too via verbose mode. I suddenly remembered some TensorFlow code that had a similar problem before; in my case it was some constant that I just changed from int to float. Indeed, I am using torch 1.2. I will see whether it works here or not. Any idea why the pip -e option is not working?


The pip install -e . is probably working; it's just that some tests are failing because the code was not tested on Torch 1.2.0.

The install should have worked fine, and you should be fine using every component in the library with torch 1.2.0 except the decoder architectures, on which we are working now. Updating to torch 1.3.0 means it will work with the decoder architectures too.

Does torch 1.3 require CUDA 10.1? I have CUDA 10.0 for TensorFlow, which still has problems with 10.1. Thanks for the info, I really appreciate your fast response!


In the official PyTorch documentation, in the installation section, you can see that you can install PyTorch 1.3 with CUDA 9.2 or CUDA 10.1, so PyTorch 1.3 + CUDA 10.1 works!
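The compatibility claim above can be captured in a tiny lookup; the table below contains only the pairings stated in this comment (it is an assumption drawn from the thread, not PyTorch's full release matrix):

```python
# CUDA versions the official torch 1.3 wheels target, per the comment
# above (illustrative only, not an exhaustive release matrix).
TORCH_WHEEL_CUDA = {"1.3": {"9.2", "10.1"}}


def wheel_available(torch_ver: str, cuda_ver: str) -> bool:
    """Check whether an official wheel pairing is listed in the table."""
    return cuda_ver in TORCH_WHEEL_CUDA.get(torch_ver, set())


print(wheel_available("1.3", "10.1"))  # True
print(wheel_available("1.3", "10.0"))  # False -> not listed in this thread
```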


What is the difference between the following?

  • pip install [--editable] .
  • pip install -e .
  • python setup.py develop

The first one doesn't work for me, yet it is in the readme. The other two do. If this is system-dependent, shouldn't this be added to the readme?

@internetcoffeephone, using square brackets in a command-line interface is a common way to denote optional parameters. The first command means that you can use either pip install . or pip install --editable .
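The traceback at the top of the thread shows why the literal brackets break: pip parses anything of the form "name[...]" as the extras syntax, and "--editable" is not a valid extra name. This can be reproduced with the `packaging` library that pip vendors:

```python
from packaging.requirements import InvalidRequirement, Requirement

# The same parse step pip's traceback shows (it prepends "placeholder"
# to the extras string it extracted from the command line):
try:
    Requirement("placeholder[--editable]")
except InvalidRequirement as exc:
    print("parse failed:", exc)

# Valid extras syntax, for contrast:
print(Requirement("requests[socks]").extras)  # {'socks'}
```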

@LysandreJik That makes sense, thanks for your answer!

Still, I'd argue against putting it in the readme like that: first, because it doesn't produce a sensible error message; second, because anyone who wants an editable installation will already know about that optional parameter.

As for the difference between the above commands, I found this page:

Try to avoid calling setup.py directly, it will not properly tell pip that you've installed your package.
With pip install -e:

For local projects, the “SomeProject.egg-info” directory is created relative to the project path. This is one advantage over just using setup.py develop, which creates the “egg-info” directly relative to the current working directory.
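Whichever of these commands you use, a quick way to confirm which copy of a package you are actually importing (the cloned source tree for an editable install, site-packages for a regular one) is to check the resolved module path; the helper below is just an illustration using a stdlib module:

```python
import importlib


def install_location(name: str) -> str:
    """Return the file path the named package resolves to when imported."""
    module = importlib.import_module(name)
    return module.__file__ or "<namespace package>"


# stdlib example; after installing, try install_location("transformers")
# and check whether it points into your clone or into site-packages.
print(install_location("json"))
```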

I removed [--editable] from the instructions because I found them confusing (before stumbling upon this issue).
