Transformers: Installation Error - Failed building wheel for tokenizers

Created on 12 Feb 2020 · 30 comments · Source: huggingface/transformers

🐛 Bug

Information

Model I am using (Bert, XLNet ...): N/A

Language I am using the model on (English, Chinese ...): N/A

The problem arises when using:

  • [X] the official example scripts: (give details below)

The problem arises during transformers installation on Microsoft Windows 10 Pro, version 10.0.17763.

After creating and activating the virtual environment, installing transformers fails with the following error:

"error: can not find Rust Compiler"
"ERROR: Failed building wheel for tokenizers"
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers which use PEP 517 and cannot be installed d

The task I am working on is:
[X] transformers installation

To reproduce

Steps to reproduce the behavior:

  1. From the command line interface, create and activate a virtual environment by following the steps at this URL: https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/
  2. Install transformers from source, by following the example in the "From source" section at this URL: https://github.com/huggingface/transformers
py -m pip --version
py -m pip install --upgrade pip
py -m pip install --user virtualenv
py -m venv env
.\env\Scripts\activate
pip install transformers

ERROR: Command errored out with exit status 1:
   command: 'c:\users\vbrandao\env\scripts\python.exe' 'c:\users\vbrandao\env\lib\site-packages\pip\_vendor\pep517\_in_process.py' build_wheel 'C:\Users\vbrandao\AppData\Local\Temp\tmpj6evjmze'
       cwd: C:\Users\vbrandao\AppData\Local\Temp\pip-install-sza2_lmj\tokenizers
  Complete output (10 lines):
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build\lib
  creating build\lib\tokenizers
  copying tokenizers\__init__.py -> build\lib\tokenizers
  running build_ext
  running build_rust
  error: Can not find Rust compiler
  ----------------------------------------
  ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers which use PEP 517 and cannot be installed directly
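
For reference, the failure happens because building the tokenizers wheel compiles Rust code, so a Rust toolchain must be present on the machine. A minimal sketch of installing one on Windows (assuming the standard rustup installer downloaded from https://rustup.rs and a PowerShell session):

.\rustup-init.exe        # run the installer fetched from https://rustup.rs
rustc --version          # in a new shell, verify the compiler is on PATH
cargo --version
pip install transformers # retry once rustc is available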

Expected behavior

Installation of transformers should complete successfully.

Environment info

  • transformers version: N/A - installation step
  • Platform: Command Line Interface / Virtual Env
  • Python version: python 3.8
  • PyTorch version (GPU?): N/A
  • Tensorflow version (GPU?): N/A
  • Using GPU in script?: N/A
  • Using distributed or parallel set-up in script?: N/A
Labels: Tokenization, Installation, wontfix

Most helpful comment

I was having the same issue in a virtualenv on macOS Mojave. I managed to solve it and install transformers 2.5.1 by manually installing the latest version of tokenizers (0.6.0) instead of the 0.5.2 that the transformers package requires.

pip install tokenizers

Git clone the latest version of transformers:

git clone https://github.com/huggingface/transformers

Before running the installation, edit transformers/setup.py and change the tokenizers requirement to 0.6.0:

Line 93: install_requires=[
"numpy",
"tokenizers == 0.6.0",

Then run as usual:

cd transformers
pip install .

I assume that you could also skip the first step and just let pip collect the package as you run the install.
I'm quite new to this, so I just wanted to share my take.
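
For reference, a sketch of that edit as shell one-liners (assuming the pin in setup.py reads exactly "tokenizers == 0.5.2" at that point; the sed -i '' form is the macOS/BSD variant, GNU sed takes plain -i):

cd transformers
sed -i '' 's/tokenizers == 0.5.2/tokenizers == 0.6.0/' setup.py   # bump the pin
pip install .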

All 30 comments

Having the exact same issue on a Linux machine!

Environment: macOS Mojave Ver 10.14.6
Tried installing both from pip and source. Same issue:

Successfully built transformers
Failed to build tokenizers

Result was that Transformers was not installed (not listed in pip freeze)

This, however, should work - it seems you just won't get the new tokenizers:
pip install transformers==2.4.1

@GDBSD I had the same issue on the same OS version and also tried pip and source. Your version specification worked.

Had the same issue on macOS Mojave when doing pip3 install. Tried pip2 install; it worked, but then I got another error when running my script telling me I should really be using Python 3.

I tried @GDBSD's answer, but I got this error:

ERROR: Exception:
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/py_compile.py", line 143, in compile
    _optimize=optimize)
  File "<frozen importlib._bootstrap_external>", line 791, in source_to_code
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/private/var/folders/g0/5zwy4mtx7579v5x6rxqb083r0000gn/T/pip-unpacked-wheel-k410h9s0/sacremoses/sent_tokenize.py", line 69
    if re.search(IS_EOS, token)
                              ^
SyntaxError: invalid syntax

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/compileall.py", line 159, in compile_file
    invalidation_mode=invalidation_mode)
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/py_compile.py", line 147, in compile
    raise py_exc
py_compile.PyCompileError:   File "/private/var/folders/g0/5zwy4mtx7579v5x6rxqb083r0000gn/T/pip-unpacked-wheel-k410h9s0/sacremoses/sent_tokenize.py", line 69
    if re.search(IS_EOS, token)
                              ^
SyntaxError: invalid syntax


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/pip/_internal/cli/base_command.py", line 186, in _main
    status = self.run(options, args)
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/pip/_internal/commands/install.py", line 404, in run
    use_user_site=options.use_user_site,
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/pip/_internal/req/__init__.py", line 71, in install_given_reqs
    **kwargs
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/pip/_internal/req/req_install.py", line 815, in install
    warn_script_location=warn_script_location,
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/pip/_internal/operations/install/wheel.py", line 614, in install_wheel
    warn_script_location=warn_script_location,
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/pip/_internal/operations/install/wheel.py", line 338, in install_unpacked_wheel
    compileall.compile_dir(source, force=True, quiet=True)
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/compileall.py", line 97, in compile_dir
    legacy, optimize, invalidation_mode):
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/compileall.py", line 169, in compile_file
    msg = err.msg.encode(sys.stdout.encoding,
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/pip/_internal/utils/misc.py", line 554, in encoding
    return self.orig_stream.encoding
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/codecs.py", line 409, in __getattr__
    return getattr(self.stream, name)
AttributeError: '_io.BufferedWriter' object has no attribute 'encoding'

Yes, I had the same issue with pip3.6 install.

Can you all run python transformers-cli env and post the output here? It provides some useful information about your platform that might help with debugging.

Hi, I had the same problem and resolved it by installing Rust.
"error: Can not find Rust compiler"

For macOS, I used curl https://sh.rustup.rs -sSf | sh. I also found that it needed a nightly version of Rust, so you have to specify that in the install options.
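
For reference, a sketch of selecting the nightly toolchain once rustup is installed (assuming a default rustup setup):

rustup toolchain install nightly
rustup default nightly
rustc --version   # should report a nightly build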

Hi, I also had the same problem with my initial installation of the library. After some time, I realized that my Anaconda version was 32-bit. You can check your version with
python -c "import struct;print( 8 * struct.calcsize('P'))"
The output should be 64.
If it is 32, then you have to reinstall a 64-bit Python distribution.
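
An equivalent check via the standard library, for reference:

python -c "import platform; print(platform.architecture()[0])"   # expect '64bit'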

@Wild3d I can confirm, after running your snippet, that I am on a 64-bit version.

@gardnerds After creating a new environment to try your solution, it also worked for me. I didn't have Rust installed before. It successfully built the wheel for tokenizers (PEP 517).

@gardnerds Also worked for me, using Python 3.7 and building from source in a clean conda env.

Installing 64-bit Python instead of 32-bit solved the same issue for me.

I was having the same issue in a virtualenv on macOS Mojave. I managed to solve it and install transformers 2.5.1 by manually installing the latest version of tokenizers (0.6.0) instead of the 0.5.2 that the transformers package requires.

pip install tokenizers

Git clone the latest version of transformers:

git clone https://github.com/huggingface/transformers

Before running the installation, edit transformers/setup.py and change the tokenizers requirement to 0.6.0:

Line 93: install_requires=[
"numpy",
"tokenizers == 0.6.0",

Then run as usual:

cd transformers
pip install .

I assume that you could also skip the first step and just let pip collect the package as you run the install.
I'm quite new to this, so I just wanted to share my take.

@dafraile That solves mine! Thank you very much!

@dafraile That helps, thanks a lot!

I managed to solve the issue by installing the Rust compiler:

  • Install Rust: curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
  • Restart the terminal
  • pip install transformers==2.5.1

Environment: macOS Mojave Ver 10.14.6
Tried installing both from pip and source. Same issue:

Successfully built transformers
Failed to build tokenizers

Result was that Transformers was not installed (not listed in pip freeze)

This, however, should work - it seems you just won't get the new tokenizers:
pip install transformers==2.4.1

This solution is working for me

I managed to solve the issue by installing the Rust compiler:

  • Install Rust: curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
  • Restart the terminal
  • pip install transformers==2.5.1

It works for me, thanks!
You can do source $HOME/.cargo/env instead of restarting the terminal.

@gardnerds, adding $HOME/.cargo/bin to PATH after installing Rust fixed my installation. Thank you.
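
For reference, a sketch of that PATH change for a POSIX shell profile (assuming the default rustup install location):

export PATH="$HOME/.cargo/bin:$PATH"   # add to ~/.bashrc or ~/.zshrc, then start a new shell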

@dafraile Thanks a lot. It solves my problem

@dafraile Thanks! It works!

@AvivNavon Thanks! Solved my problem too. (macOS Mojave)
I installed the latest version of transformers, though (2.8.0):
pip install transformers instead of pip install transformers==2.5.1

Resolved this issue by installing Rust.

I resolved this issue by installing Rust - I initially forgot to restart the terminal first.
I'm using Mojave 10.14.5.
This thread is great! Btw, I had no such issues on my Ubuntu 18.04 machine.

@phihung's recommendation works.

Just installing the Rust compiler works for me too (thanks @phihung). I'm on Mac Mojave 10.14.6.
Maybe a conda installation would be able to overcome this? (I don't know whether pip can force-install a third-party compiler.)

@dafraile Actually your solution is the closest one! But I now see that they have corrected that line in setup.py, so it is tokenizers==0.7.0 now (and the newest tokenizers release is 0.7.0).
So the real takeaway is that we should

  1. always update transformers from source
  2. (really important!) uninstall the old version before we reinstall the newest :p
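
For reference, a minimal sketch of that sequence (assuming an existing source checkout named transformers):

pip uninstall -y transformers   # remove the old version first
cd transformers
git pull                        # update the source checkout
pip install .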

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

I am facing a similar issue trying to build on a PowerPC machine with RedHat.
I am getting errors when trying to build tokenizers:

Building wheels for collected packages: tokenizers
  Building wheel for tokenizers (PEP 517) ... error
  ERROR: Command errored out with exit status 1:
   command: /home/aarbelle/.conda/envs/gbs/bin/python3.6 /home/aarbelle/.conda/envs/gbs/lib/python3.6/site-packages/pip/_vendor/pep517/_in_process.py build_wheel /tmp/tmpd6q9xccz
       cwd: /tmp/pip-install-ohxny31i/tokenizers
  Complete output (136 lines):
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib
  creating build/lib/tokenizers
  copying tokenizers/__init__.py -> build/lib/tokenizers
  creating build/lib/tokenizers/models
  copying tokenizers/models/__init__.py -> build/lib/tokenizers/models
  creating build/lib/tokenizers/decoders
  copying tokenizers/decoders/__init__.py -> build/lib/tokenizers/decoders
  creating build/lib/tokenizers/normalizers
  copying tokenizers/normalizers/__init__.py -> build/lib/tokenizers/normalizers
  creating build/lib/tokenizers/pre_tokenizers
  copying tokenizers/pre_tokenizers/__init__.py -> build/lib/tokenizers/pre_tokenizers
  creating build/lib/tokenizers/processors
  copying tokenizers/processors/__init__.py -> build/lib/tokenizers/processors
  creating build/lib/tokenizers/trainers
  copying tokenizers/trainers/__init__.py -> build/lib/tokenizers/trainers
  creating build/lib/tokenizers/implementations
  copying tokenizers/implementations/bert_wordpiece.py -> build/lib/tokenizers/implementations
  copying tokenizers/implementations/__init__.py -> build/lib/tokenizers/implementations
  copying tokenizers/implementations/byte_level_bpe.py -> build/lib/tokenizers/implementations
  copying tokenizers/implementations/sentencepiece_bpe.py -> build/lib/tokenizers/implementations
  copying tokenizers/implementations/base_tokenizer.py -> build/lib/tokenizers/implementations
  copying tokenizers/implementations/char_level_bpe.py -> build/lib/tokenizers/implementations
  copying tokenizers/__init__.pyi -> build/lib/tokenizers
  copying tokenizers/models/__init__.pyi -> build/lib/tokenizers/models
  copying tokenizers/decoders/__init__.pyi -> build/lib/tokenizers/decoders
  copying tokenizers/normalizers/__init__.pyi -> build/lib/tokenizers/normalizers
  copying tokenizers/pre_tokenizers/__init__.pyi -> build/lib/tokenizers/pre_tokenizers
  copying tokenizers/processors/__init__.pyi -> build/lib/tokenizers/processors
  copying tokenizers/trainers/__init__.pyi -> build/lib/tokenizers/trainers
  running build_ext
  running build_rust
      Updating crates.io index
      Updating git repository `https://github.com/n1t0/rayon-cond`
  warning: unused manifest key: target.x86_64-apple-darwin.rustflags
     Compiling proc-macro2 v1.0.21
     Compiling unicode-xid v0.2.1
     Compiling autocfg v1.0.1
     Compiling syn v1.0.41
     Compiling libc v0.2.77
     Compiling lazy_static v1.4.0
     Compiling cfg-if v0.1.10
     Compiling memchr v2.3.3
     Compiling serde_derive v1.0.116
     Compiling scopeguard v1.1.0
     Compiling serde v1.0.116
     Compiling maybe-uninit v2.0.0
     Compiling regex-syntax v0.6.18
     Compiling ryu v1.0.5
     Compiling rayon-core v1.8.1
     Compiling getrandom v0.1.15
     Compiling serde_json v1.0.57
     Compiling smallvec v1.4.2
     Compiling itoa v0.4.6
     Compiling inventory v0.1.9
     Compiling pkg-config v0.3.18
     Compiling proc-macro-hack v0.5.18
     Compiling bitflags v1.2.1
     Compiling cc v1.0.60
     Compiling unicode-width v0.1.8
     Compiling either v1.6.1
       Running `rustc --crate-name build_script_build --edition=2018 /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/proc-macro2-1.0.21/build.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type bin --emit=dep-info,link -C opt-level=3 -Cembed-bitcode=no --cfg 'feature="default"' --cfg 'feature="proc-macro"' -C metadata=93385cb1e678e330 -C extra-filename=-93385cb1e678e330 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/build/proc-macro2-93385cb1e678e330 -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name unicode_xid /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/unicode-xid-0.2.1/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no --cfg 'feature="default"' -C metadata=cac161967aa527e1 -C extra-filename=-cac161967aa527e1 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name autocfg /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/autocfg-1.0.1/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no -C metadata=ddb9624730d1e52a -C extra-filename=-ddb9624730d1e52a --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name build_script_build --edition=2018 /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/syn-1.0.41/build.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type bin --emit=dep-info,link -C opt-level=3 -Cembed-bitcode=no --cfg 'feature="clone-impls"' --cfg 'feature="default"' --cfg 'feature="derive"' --cfg 'feature="extra-traits"' --cfg 'feature="full"' --cfg 'feature="parsing"' --cfg 'feature="printing"' --cfg 'feature="proc-macro"' --cfg 'feature="quote"' --cfg 'feature="visit"' -C metadata=9988fc7a157e69c9 -C extra-filename=-9988fc7a157e69c9 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/build/syn-9988fc7a157e69c9 -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name build_script_build /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/libc-0.2.77/build.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type bin --emit=dep-info,link -C opt-level=3 -Cembed-bitcode=no --cfg 'feature="default"' --cfg 'feature="std"' -C metadata=5a4798f2b06c36bd -C extra-filename=-5a4798f2b06c36bd --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/build/libc-5a4798f2b06c36bd -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name cfg_if --edition=2018 /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/cfg-if-0.1.10/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no -C metadata=a7dbefe7725970f6 -C extra-filename=-a7dbefe7725970f6 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name lazy_static /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/lazy_static-1.4.0/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no -C metadata=09f05f31cfc64306 -C extra-filename=-09f05f31cfc64306 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name build_script_build /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/memchr-2.3.3/build.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type bin --emit=dep-info,link -C opt-level=3 -Cembed-bitcode=no --cfg 'feature="default"' --cfg 'feature="std"' --cfg 'feature="use_std"' -C metadata=a8f56f28f9bbd928 -C extra-filename=-a8f56f28f9bbd928 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/build/memchr-a8f56f28f9bbd928 -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name build_script_build /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/serde_derive-1.0.116/build.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type bin --emit=dep-info,link -C opt-level=3 -Cembed-bitcode=no --cfg 'feature="default"' -C metadata=d850080603f4774e -C extra-filename=-d850080603f4774e --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/build/serde_derive-d850080603f4774e -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name scopeguard /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/scopeguard-1.1.0/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no -C metadata=91afa33e60eb09b1 -C extra-filename=-91afa33e60eb09b1 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name build_script_build /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/serde-1.0.116/build.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type bin --emit=dep-info,link -C opt-level=3 -Cembed-bitcode=no --cfg 'feature="default"' --cfg 'feature="derive"' --cfg 'feature="serde_derive"' --cfg 'feature="std"' -C metadata=1a02cab7c16e427d -C extra-filename=-1a02cab7c16e427d --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/build/serde-1a02cab7c16e427d -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name build_script_build /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/maybe-uninit-2.0.0/build.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type bin --emit=dep-info,link -C opt-level=3 -Cembed-bitcode=no -C metadata=9f94ee50e1295f1f -C extra-filename=-9f94ee50e1295f1f --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/build/maybe-uninit-9f94ee50e1295f1f -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name regex_syntax /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/regex-syntax-0.6.18/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no --cfg 'feature="default"' --cfg 'feature="unicode"' --cfg 'feature="unicode-age"' --cfg 'feature="unicode-bool"' --cfg 'feature="unicode-case"' --cfg 'feature="unicode-gencat"' --cfg 'feature="unicode-perl"' --cfg 'feature="unicode-script"' --cfg 'feature="unicode-segment"' -C metadata=604baccf8464f333 -C extra-filename=-604baccf8464f333 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name build_script_build --edition=2018 /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/ryu-1.0.5/build.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type bin --emit=dep-info,link -C opt-level=3 -Cembed-bitcode=no -C metadata=a40cc9c191e07da8 -C extra-filename=-a40cc9c191e07da8 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/build/ryu-a40cc9c191e07da8 -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name build_script_build --edition=2018 /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/getrandom-0.1.15/build.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type bin --emit=dep-info,link -C opt-level=3 -Cembed-bitcode=no --cfg 'feature="std"' -C metadata=3134d02611660405 -C extra-filename=-3134d02611660405 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/build/getrandom-3134d02611660405 -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name build_script_build --edition=2018 /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/rayon-core-1.8.1/build.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type bin --emit=dep-info,link -C opt-level=3 -Cembed-bitcode=no -C metadata=4f258883be84b941 -C extra-filename=-4f258883be84b941 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/build/rayon-core-4f258883be84b941 -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name build_script_build --edition=2018 /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/serde_json-1.0.57/build.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type bin --emit=dep-info,link -C opt-level=3 -Cembed-bitcode=no --cfg 'feature="default"' --cfg 'feature="std"' -C metadata=9c7f2a71de758875 -C extra-filename=-9c7f2a71de758875 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/build/serde_json-9c7f2a71de758875 -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name smallvec --edition=2018 /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/smallvec-1.4.2/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no -C metadata=af516ba081f6df94 -C extra-filename=-af516ba081f6df94 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name itoa /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/itoa-0.4.6/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no -C metadata=def6b42508610d1c -C extra-filename=-def6b42508610d1c --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name build_script_build --edition=2018 /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/inventory-0.1.9/build.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type bin --emit=dep-info,link -C opt-level=3 -Cembed-bitcode=no -C metadata=55eb92d7e72d18d1 -C extra-filename=-55eb92d7e72d18d1 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/build/inventory-55eb92d7e72d18d1 -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name proc_macro_hack --edition=2018 /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/proc-macro-hack-0.5.18/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type proc-macro --emit=dep-info,link -C prefer-dynamic -C opt-level=3 -Cembed-bitcode=no -C metadata=24f8c9a7698fc568 -C extra-filename=-24f8c9a7698fc568 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --extern proc_macro --cap-lints allow`
       Running `rustc --crate-name pkg_config /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/pkg-config-0.3.18/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no -C metadata=a729ffec8f42b1bf -C extra-filename=-a729ffec8f42b1bf --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name build_script_build /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/bitflags-1.2.1/build.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type bin --emit=dep-info,link -C opt-level=3 -Cembed-bitcode=no --cfg 'feature="default"' -C metadata=86d2212697398c07 -C extra-filename=-86d2212697398c07 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/build/bitflags-86d2212697398c07 -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name cc --edition=2018 /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/cc-1.0.60/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no -C metadata=bd7ffcf8ae7a9c20 -C extra-filename=-bd7ffcf8ae7a9c20 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
     Compiling unindent v0.1.6
     Compiling version_check v0.9.2
     Compiling ppv-lite86 v0.2.9
     Compiling number_prefix v0.3.0
     Compiling strsim v0.8.0
     Compiling vec_map v0.8.2
     Compiling ansi_term v0.11.0
     Compiling unicode_categories v0.1.1
       Running `rustc --crate-name unicode_width /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/unicode-width-0.1.8/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no --cfg 'feature="default"' -C metadata=2ffe7097d8c6b666 -C extra-filename=-2ffe7097d8c6b666 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name either /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/either-1.6.1/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no --cfg 'feature="default"' --cfg 'feature="use_std"' -C metadata=644a45e467402f81 -C extra-filename=-644a45e467402f81 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name version_check /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/version_check-0.9.2/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no -C metadata=aa50462cc4c9df50 -C extra-filename=-aa50462cc4c9df50 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name unindent --edition=2018 /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/unindent-0.1.6/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no -C metadata=fdeaf6996f560ff0 -C extra-filename=-fdeaf6996f560ff0 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name ppv_lite86 --edition=2018 /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/ppv-lite86-0.2.9/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no --cfg 'feature="simd"' --cfg 'feature="std"' -C metadata=e3e8e9d2c7899d24 -C extra-filename=-e3e8e9d2c7899d24 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name number_prefix /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/number_prefix-0.3.0/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no --cfg 'feature="default"' --cfg 'feature="std"' -C metadata=a640ea83003307f7 -C extra-filename=-a640ea83003307f7 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name strsim /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/strsim-0.8.0/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no -C metadata=816b20067865d64c -C extra-filename=-816b20067865d64c --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name vec_map /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/vec_map-0.8.2/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no -C metadata=a7a30dfbdcea21f0 -C extra-filename=-a7a30dfbdcea21f0 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name ansi_term /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/ansi_term-0.11.0/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no -C metadata=9c09db9f9cbc7749 -C extra-filename=-9c09db9f9cbc7749 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
       Running `rustc --crate-name unicode_categories /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/unicode_categories-0.1.1/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no -C metadata=f5d72f9ccd926082 -C extra-filename=-f5d72f9ccd926082 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --cap-lints allow`
     Compiling lock_api v0.3.4
       Running `rustc --crate-name lock_api --edition=2018 /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/lock_api-0.3.4/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no --cfg 'feature="nightly"' -C metadata=54cc9296368f9d0e -C extra-filename=-54cc9296368f9d0e --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --extern scopeguard=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps/libscopeguard-91afa33e60eb09b1.rmeta --cap-lints allow`
     Compiling thread_local v1.0.1
       Running `rustc --crate-name thread_local /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/thread_local-1.0.1/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no -C metadata=44b3f6e675105288 -C extra-filename=-44b3f6e675105288 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --extern lazy_static=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps/liblazy_static-09f05f31cfc64306.rmeta --cap-lints allow`
     Compiling textwrap v0.11.0
       Running `rustc --crate-name textwrap /home/aarbelle/.cargo/registry/src/github.com-1ecc6299db9ec823/textwrap-0.11.0/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -Cembed-bitcode=no -C metadata=05dca2f2bb6ce7b5 -C extra-filename=-05dca2f2bb6ce7b5 --out-dir /tmp/pip-install-ohxny31i/tokenizers/target/release/deps -L dependency=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps --extern unicode_width=/tmp/pip-install-ohxny31i/tokenizers/target/release/deps/libunicode_width-2ffe7097d8c6b666.rmeta --cap-lints allow`
       Running `/tmp/pip-install-ohxny31i/tokenizers/target/release/build/serde_json-9c7f2a71de758875/build-script-build`
       Running `/tmp/pip-install-ohxny31i/tokenizers/target/release/build/rayon-core-4f258883be84b941/build-script-build`
  error: failed to run custom build command for `serde_json v1.0.57`

  Caused by:
    could not execute process `/tmp/pip-install-ohxny31i/tokenizers/target/release/build/serde_json-9c7f2a71de758875/build-script-build` (never executed)

  Caused by:
    No such file or directory (os error 2)
  warning: build failed, waiting for other jobs to finish...
  error: failed to run custom build command for `rayon-core v1.8.1`

  Caused by:
    could not execute process `/tmp/pip-install-ohxny31i/tokenizers/target/release/build/rayon-core-4f258883be84b941/build-script-build` (never executed)

  Caused by:
    No such file or directory (os error 2)
  warning: build failed, waiting for other jobs to finish...
  error: build failed
  /tmp/pip-build-env-7kdpvzfy/overlay/lib/python3.6/site-packages/setuptools/dist.py:452: UserWarning: Normalizing '0.8.1.rc2' to '0.8.1rc2'
    warnings.warn(tmpl.format(**locals()))
  cargo rustc --lib --manifest-path Cargo.toml --features pyo3/extension-module --release --verbose -- --crate-type cdylib
  error: cargo failed with code: 101

  ----------------------------------------
  ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers which use PEP 517 and cannot be installed directly

@arbellea Please open an issue on the tokenizers repo: https://github.com/huggingface/tokenizers

