When trying to run deno on Centos 7, it fails:
[maxim@maxim deno] $ target/release/deno tests/worker.js
target/release/deno: /lib64/libc.so.6: version `GLIBC_2.18' not found (required by target/release/deno)
Additional information:
First things first: CentOS 7 is based on RHEL 7, as one may know. And, according to redhat.com:
In Red Hat Enterprise Linux 7, the glibc libraries (libc, libm, libpthread, NSS plug-ins, and others) are based on the glibc 2.17 release
Replacing the system glibc is a very bad idea, basically because every binary on the system is built against that specific glibc version. So that's not an option.
However, the RHEL 8 Beta uses glibc 2.28 (source), but CentOS 8 will only be released once the stable version of RHEL 8 is released (source), so upgrading CentOS is not an option right now.
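For reference, a quick way to confirm which glibc a system actually ships (a sketch; the rpm query only applies to RHEL-family distros):

```shell
# Print the running glibc version two ways; on CentOS 7 both report 2.17.
ldd --version | head -n1   # glibc's ldd prints the libc version
rpm -q glibc               # RPM-based distros only
```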
I have also tried installing glibc in a nonstandard location as described here. Unfortunately, when I try to run deno with that custom glibc, I'm getting:
[maxim@maxim deno]$ LD_LIBRARY_PATH=/opt/glibc-2.18/lib target/release/deno tests/worker.js
Segmentation fault (core dumped)
I've tried to debug it with lldb, gdb, and abrt, but none of them worked for me, probably because I'm doing something wrong.
lldb:
[maxim@maxim deno]$ LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/glibc-2.18/lib lldb -- target/debug/deno tests/worker.js
lldb: relocation error: /opt/glibc-2.18/lib/libc.so.6: symbol _dl_find_dso_for_object, version GLIBC_PRIVATE not defined in file ld-linux-x86-64.so.2 with link time reference
gdb:
[maxim@maxim deno]$ LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/glibc-2.18/lib gdb target/release/deno tests/worker.js
gdb: relocation error: /opt/glibc-2.18/lib/libc.so.6: symbol _dl_find_dso_for_object, version GLIBC_PRIVATE not defined in file ld-linux-x86-64.so.2 with link time reference
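The relocation errors above are typical when LD_LIBRARY_PATH points at a newer libc while the process is still started by the system's /lib64 dynamic loader: ld.so and libc.so.6 must come from the same glibc build. One hedged workaround (the loader filename under /opt/glibc-2.18 is an assumption) is to invoke the custom glibc's own loader directly:

```shell
# Start deno via the custom glibc's dynamic loader so that ld.so and
# libc.so.6 match. Adjust the ld-*.so filename to whatever the
# nonstandard glibc install actually produced.
/opt/glibc-2.18/lib/ld-2.18.so \
    --library-path /opt/glibc-2.18/lib:/lib64:/usr/lib64 \
    target/release/deno tests/worker.js
```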
And abrt won't detect any crashes, even though I've set OpenGPGCheck = no and ProcessUnpackaged = yes in /etc/abrt/abrt-action-save-package-data.conf, restarted abrtd and abrt-ccpp, and even tried rebooting. Nothing helped.
I hope that I'm doing something wrong, because right now I think using a custom glibc is the only option if I don't want to containerize deno.
Also, as far as I understand, GLIBC_2.18 isn't a direct requirement of deno, but rather a requirement of some third-party dependency.
I've found Chromium bug glibc dependency creeped up to 2.18 in M61, breaking EL7 support which seems relevant, but it was fixed a while ago.
I'm not well-versed enough in C or library symbols to find which dependency requires 2.18, so any help here would be really appreciated.
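Finding which imported symbols drag in GLIBC_2.18 is generic ELF inspection, nothing deno-specific; a sketch:

```shell
# List the dynamic symbols the binary imports, filtered to the glibc
# version CentOS 7 (glibc 2.17) cannot provide. __cxa_thread_atexit_impl,
# added in glibc 2.18 for C++ thread_local destructors, is the usual
# culprit (it is also what the Chromium bug above was about).
objdump -T target/release/deno | grep 'GLIBC_2\.18'
```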
I hope that @ry would also agree that we should support CentOS, and perhaps once this issue is fixed, we can run travis not only for Ubuntu 14.04 LTS (trusty) but also for CentOS. Many thanks for your attention!
🎉 Yay, I've found a workaround:
cp target/debug/deno build/linux/debian_sid_amd64-sysroot
sudo chroot build/linux/debian_sid_amd64-sysroot /deno
and the deno console should pop up.
So it seems like deno and/or V8 uses the Debian sysroot during the build, which has newer libraries than CentOS 7; that's why the build passes but deno won't run. I'll look into it further, but I hope the workaround helps someone.
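One way to confirm the sysroot theory is to ask the sysroot's libc which symbol versions it defines (a sketch; the library path inside the sysroot is an assumption based on Debian's usual layout):

```shell
# A Debian sid sysroot ships a libc newer than 2.17, so any version tag
# above GLIBC_2.17 printed here explains why a binary linked against it
# won't run on CentOS 7.
strings build/linux/debian_sid_amd64-sysroot/lib/x86_64-linux-gnu/libc.so.6 \
  | grep '^GLIBC_2\.' | sort -V | tail -n 3
```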
Well, no luck so far. I've managed to disable the Debian sysroot and tried building with my own, with no luck. Something is wrong with the gcc version, or some libraries. Not sure yet.
Also tried building it in an Ubuntu trusty sysroot, with no luck: some libs are missing.
I'm starting to consider this a bug in V8. As far as I understand, even if we build V8 with newer libs and tools, it should still work at runtime with older libs.
There might also be a chance to get it working by statically linking glibc, but I can't tell how to do that, and I don't think it's a good idea anyway because it will increase the binary size.
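For what it's worth, Rust itself can produce fully static binaries by targeting musl instead of glibc; whether deno's gn/V8 build would tolerate that target is another question entirely, so treat this as a general sketch for a plain Rust crate, not a deno recipe:

```toml
# .cargo/config.toml -- hypothetical static-linking setup for an ordinary
# Rust crate; deno's build wraps gn/ninja and may not honor this.
[build]
target = "x86_64-unknown-linux-musl"

[target.x86_64-unknown-linux-musl]
rustflags = ["-C", "target-feature=+crt-static"]
```

A binary built this way has no runtime glibc requirement at all, at the cost of a larger executable.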
@Maxim-Mazurok Sorry for the slow response. I've looked into this a bit and it does seem that we should be supporting glibc 2.17, since chromium does. This is not a bug in V8; this is a bug in Deno's build process.
https://cs.chromium.org/chromium/src/build/linux/sysroot_scripts/sysroot-creator.sh?l=300&rcl=7809ce719aea456591007cb3cd1fed5b572e632c
We do build with the chromium sysroot, but it's possible that it's an old version due to our caching on travis?
https://github.com/denoland/deno/blob/ed6aec9bf0822dfba32f8a7e1781c5ca67881f6b/.travis.yml#L25
Have you tried building from source? I suspect that will work. (Maybe put use_sysroot=false into .gn first?)
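For reference, that override ends up in the generated args.gn in the build output directory. A minimal fragment matching what later comments in this thread report working (all other flags are whatever deno's tooling writes by default):

```
# args.gn in the build output directory, e.g. target/release/
use_sysroot = false
```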
But we should figure out what the actual issue is here.
Can you try to run these binaries:
https://tinyclouds.org/deno_x64_linux.gz
https://tinyclouds.org/deno_x64_linux_cargo.gz
I'm building from source for 99% of my tries; I've tried pre-built binaries only once or twice. So it's not about travis, but rather about building it locally for now. As a matter of fact, I've also disabled sccache, in case it loads something pre-built with the wrong libs.
Those binaries that you've suggested resulted in the same error.
I've tried use_sysroot=false before and build failed, here's the log: gist.
I've also tried using a more recent sysroot; the build succeeded, but I got the same glibc error upon running.
About sysroot-creator: I saw that one, and it seems they use nm -D --defined-only --with-symbol-versions "${libm_so}" | "${SCRIPT_DIR}/find_incompatible_glibc_symbols.py" >> "${math_h}" to detect incompatible symbols and apply a hack to make them work.
I've run similar checks on other libraries a couple of times. They found:
__cxa_thread_atexit_impl@@GLIBC_2.18 and GLIBC_2.18@@GLIBC_2.18 in libpcprofile.so
pthread_getattr_default_np@@GLIBC_2.18 in libpthread-2.27.so
By the way, my nm is 2.27-34.base.el7, and it fails with nm: unrecognized option '--with-symbol-versions' when I try to run it manually. I had to compile a newer version and replace nm with the path to my new version in all places. The command was:
find . -type f -name '*.so' -print0 | xargs -0 nm -D --defined-only --with-symbol-versions | ./build/linux/sysroot_scripts/find_incompatible_glibc_symbols.py
Amazon Linux 1 (not 2) also seems to have this problem. I tried the above 2 binaries, but they don't seem to work
(in amazon linux 1)
[ec2-user@ip-172-31-255-61 test]$ wget https://tinyclouds.org/deno_x64_linux.gz
...
[ec2-user@ip-172-31-255-61 test]$ gunzip deno_x64_linux.gz
[ec2-user@ip-172-31-255-61 test]$ chmod +x deno_x64_linux
[ec2-user@ip-172-31-255-61 test]$ ./deno_x64_linux
./deno_x64_linux: /lib64/libc.so.6: version `GLIBC_2.18' not found (required by ./deno_x64_linux)
...
(note: AWS Lambda uses Amazon Linux 1 as its runtime environment, so this also seems to block running deno in AWS Lambda).
I found a similar issue in the rust repository https://github.com/rust-lang/rust/issues/57497 and it seems to suggest building the binary on an old distro to solve this problem.
Do you have to install glibc-static package? https://forums.aws.amazon.com/thread.jspa?threadID=271828
For AWS Lambda I wonder if it's possible to include this binary in the bundle?
@hayd Yes, I installed that package. It seems that in Amazon Linux 1, the glibc packages are explicitly v2.17:
$ LANG=en yum info glibc glibc-static
Loaded plugins: priorities, update-motd, upgrade-helper
Installed Packages
Name : glibc
Arch : x86_64
Version : 2.17
Release : 260.175.amzn1
...
Name : glibc-static
Arch : x86_64
Version : 2.17
Release : 260.175.amzn1
...
For AWS Lambda I wonder if it's possible to include this binary in the bundle?
I'm not sure how to do that... But Lambda allows a package size of 50 MB (zipped) / 250 MB (unzipped). The Linux binary of deno is about 10 MB (zipped) / 50 MB (unzipped). If the extra binary is less than 40 MB (zipped) / 200 MB (unzipped), I think that should be possible.
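The size arithmetic is easy to check locally (a sketch; deno_built_in_amznlinux stands in for whatever binary you intend to bundle):

```shell
# Compare compressed vs. uncompressed size to see how much of Lambda's
# 50 MB (zipped) / 250 MB (unzipped) budget a bundled deno would use.
gzip -kf9 deno_built_in_amznlinux
du -h deno_built_in_amznlinux deno_built_in_amznlinux.gz
```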
BTW I tried to build deno on Amazon Linux 1. The above use_sysroot = false flag worked and I successfully got a working deno binary.
[ec2-user@ip-172-31-255-117 deno]$ cat /etc/system-release
Amazon Linux AMI release 2017.03
[ec2-user@ip-172-31-255-117 deno]$ ./third_party/depot_tools/gn args target/release/ --list --short --overrides-only
cc_wrapper = "/home/ec2-user/deno/prebuilt/linux64/sccache"
clang_use_chrome_plugins = false
is_cfi = false
is_component_build = false
is_desktop_linux = false
is_official_build = true
proprietary_codecs = false
rust_treat_warnings_as_errors = true
rustc_wrapper = "/home/ec2-user/deno/prebuilt/linux64/sccache"
safe_browsing_mode = 0
symbol_level = 0
toolkit_views = false
treat_warnings_as_errors = true
use_aura = false
use_dbus = false
use_gio = false
use_glib = false
use_jumbo_build = true
use_ozone = false
use_sysroot = false
use_udev = false
v8_deprecation_warnings = false
v8_enable_gdbjit = false
v8_enable_i18n_support = false
v8_extra_library_files = []
v8_imminent_deprecation_warnings = false
v8_monolithic = false
v8_postmortem_support = true
v8_untrusted_code_mitigations = false
v8_use_external_startup_data = false
v8_use_snapshot = true
[ec2-user@ip-172-31-255-117 deno]$ ./target/release/deno version
deno: 0.4.0
v8: 7.6.53
typescript: 3.4.1
I put the binary here. https://github.com/kt3k/lambda-deno-runtime-wip/blob/master/deno_built_in_amznlinux
Having a binary without the glibc requirement should allow deploying an aws lambda function. Awesome! I'll have a play with this later in the week.
The above use_sysroot = false flag worked and I successfully got a working deno binary.
It didn't work for me https://github.com/denoland/deno/issues/1658#issuecomment-460746941
I'll try again
Building on CentOS 7 works for me now:
Put use_sysroot=false into the .gn file
yum install glibc-static
Re-ran ./tools/setup.py and ./tools/build.py
Running the test suite does not succeed fully. Something failed in (what I think are) the benchmarks

Running ./tools/benchmark.py by itself also complains about the GLIBC version (wants 2.18, I have 2.17).
Having a binary without the glibc requirement should allow deploying an aws lambda function.
I meant to say alpine linux, and had hoped it would just work 😅 I need to dig up the complicated multi-step build (there's no gn for alpine, so that needs to be built too). 🤔
I went back to trying to fix up the multi-stage docker image (the objective is to build a binary from alpine without glibc):
Dockerfile: https://gist.github.com/hayd/4e4dbe867cb11d7b579b82711b7da27d
$ RUST_BACKTRACE=1 DENO_NO_BINARY_DOWNLOAD=1 DENO_BUILD_ARGS='clang_use_chrome_plugins=false treat_warnings_as_errors=false use_sysroot=false clang_base_path="/usr" use_glib=false' DENO_GN_PATH=gn cargo install --root .. --path .
...
error: failed to run custom build command for `deno v0.21.0 (/deno/core)`
Caused by:
process didn't exit successfully: `/deno/target/release/build/deno-383b52993dc34877/build-script-build` (exit code: 101)
--- stdout
cargo:rerun-if-env-changed=DENO_BUILD_PATH
cargo:rustc-env=GN_OUT_DIR=/deno/target/release
cargo:rustc-link-search=native=/deno/target/release/obj
no binary download
ERROR at //build/config/jumbo.gni:197:26: Unknown function.
in_source_tree = string_replace(rebase_path(f),
^-------------
See //build/config/jumbo.gni:255:3: whence it was called.
internal_jumbo_target(target_name) {
^-----------------------------------
See //v8/gni/v8.gni:151:3: whence it was called.
target(link_target_type, target_name) {
^--------------------------------------
See //BUILD.gn:54:1: whence it was called.
v8_source_set("libdeno") {
^-------------------------
release: Writing gn options to '/deno/target/release/args.gn'.
is_official_build=true
symbol_level=0
clang_use_chrome_plugins=false
treat_warnings_as_errors=false
use_sysroot=false
clang_base_path="/usr"
use_glib=false
gn gen /deno/target/release
--- stderr
thread 'main' panicked at 'assertion failed: status.success()', core/build.rs:94:9
stack backtrace:
0: std::sys_common::backtrace::print
1: std::panicking::default_hook::{{closure}}
2: std::panicking::default_hook
3: std::panicking::rust_panic_with_hook
4: std::panicking::begin_panic
5: build_script_build::gn::Build::run
6: build_script_build::main
7: std::rt::lang_start::{{closure}}
8: std::panicking::try::do_call
9: __rust_maybe_catch_panic
10: std::rt::lang_start_internal
11: main
12: <unknown>
warning: build failed, waiting for other jobs to finish...
error: failed to compile `deno_cli v0.21.0 (/deno/cli)`, intermediate artifacts can be found at `/deno/target`
What am I missing? 😬
Is this issue still a problem in Deno 0.28.1 ?
I think it is still a problem on Centos7. In a docker container:
docker run -it centos:7 /bin/bash
[root@32729b720b03 /]# curl -fsSL https://deno.land/x/install/install.sh | sh -s v0.28.1
######################################################################## 100.0%
Deno was installed successfully to /root/.local/bin/deno
Manually add the directory to your $HOME/.bash_profile (or similar)
export DENO_INSTALL="/root/.local"
export PATH="$DENO_INSTALL/bin:$PATH"
Run '/root/.local/bin/deno --help' to get started
[root@32729b720b03 /]# export DENO_INSTALL="/root/.local"
[root@32729b720b03 /]# export PATH="$DENO_INSTALL/bin:$PATH"
[root@32729b720b03 /]# deno
deno: /lib64/libc.so.6: version `GLIBC_2.18' not found (required by deno)
Good news: Centos 8 and Fedora 31 have been working for me.
I claim it would be solved by https://github.com/denoland/rusty_v8/issues/49
I never dreamed it wouldn't work on CentOS7.
This is a serious problem for deno...
Or can it be solved already?
@chromsh It's been a while since I've done it, but I believe it needs to be built from source on a CentOS 7 machine; then it should work. This is probably a job for an ambitious package maintainer :)
@chromsh I believe that it should work with CentOS 8, if you don't mind upgrading.
Thank you all!
@hayd you are my hero! that's what I wanted.
We can't update 7 to 8 easily, and it's a bit of a hassle to build it yourself :(
I'll watch this repository.
https://github.com/hayd/deno-lambda/
I hope centos7 users realize this and use the cool deno runtime!
Simple Dockerfile to build Deno on CentOS 7 https://gist.github.com/nodakai/bc0c80381cd0b787d8a5c65a1771ef5f
Not sure if this is a dumb question (not a Rust expert), but is it possible to statically compile Deno so it doesn't depend on external libs?