Hello!
First of all, thank you for this great tool and your awesome work!
However, I've run into a bug when updating my project: the npm update --depth=9999 command seems to go into some kind of loop until it runs out of memory and the process is terminated with the error JavaScript heap out of memory.
I'm not using any weird commands or combinations of options; this command is actually documented by npm. An excerpt from the official documentation:
As of npm@2.6.1, npm update will only inspect top-level packages. Prior versions of npm would also recursively inspect all dependencies. To get the old behavior, use npm --depth 9999 update.
So this command looks like the only way to update the entire dependency tree, and it's broken. This means one of the fundamental functions of npm (updating dependencies) is not working. I consider this a critical bug.
This happens when I run npm update --depth=9999 in my project.
Sadly, I can't publish the repository due to its proprietary nature. However, here are the manifest, the lock file, and the verbose update log output.
$ node --version
v12.18.3
$ npm --version
6.14.7
$ cat /etc/issue
Ubuntu 18.04.4 LTS
$ uname -a
Linux invader 5.4.0-42-generic #46~18.04.1-Ubuntu SMP Fri Jul 10 07:21:24 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
I'm also getting Maximum call stack size exceeded when running this command in my private project.
I think that docs should probably be updated.
By the way, --depth 99 takes forever to complete, so I killed it.
OS: Windows 10
NPM: 6.14.7
Node: 14.7.0
@Maxim-Mazurok thank you for confirming this. What exactly do you think should be updated in the docs?
Well, if it's npm bug - then docs are fine.
But if it's somehow by design - then docs should correct 9999 to some reasonable number. Like 9, for example. 9 worked for me just fine.
I believe that npm should be able to update a project's entire dependency tree without consuming all of the RAM in the process. It does look like a bug in the update implementation or algorithm. If this example were removed from the documentation, there would be no way to update ALL of the dependencies.
I guess you can update all of them by deleting lock file?...
@Maxim-Mazurok Deleting the lock file would be a security risk, because it contains useful information like package hashes. If some package were replaced in the registry (mutated), you wouldn't notice it if you had deleted the previous lock file.
If you really want to, you can always check manually with diff whether any hashes changed.
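A minimal sketch of that manual check, assuming you kept a copy of the previous lock file (here called package-lock.old.json, a name I made up; the tiny lock files are fabricated so the snippet is self-contained):

```shell
# Fabricate two tiny lock files so the sketch runs anywhere;
# in practice you would use your saved copy and the current file.
cat > package-lock.old.json <<'EOF'
{ "dependencies": { "left-pad": { "version": "1.3.0", "integrity": "sha512-aaa" } } }
EOF
cp package-lock.old.json package-lock.json

# Pull out the "integrity" hashes, sort them, and diff them:
# empty diff output means no package hash changed.
grep -o '"integrity": "[^"]*"' package-lock.old.json | sort > old-hashes.txt
grep -o '"integrity": "[^"]*"' package-lock.json | sort > new-hashes.txt
diff old-hashes.txt new-hashes.txt && echo "no hash changes"
```

Any line that diff does print points at a package whose registry tarball hash changed between the two lock files.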
I have a concrete example project where this happens, hopefully simple enough (4 dependencies) to be debuggable:
https://github.com/gurdiga/repetitor.tsx/tree/660ab85/frontend/tests
FWIW, I'm seeing that it takes longer for npm update to return as the --depth increases:

npm --depth 13 update   ~6s
npm --depth 15 update   ~9s
npm --depth 20 update   ~33s
npm --depth 40 update   didn't return after a few minutes, so I ^C-ed it

I hope this helps, and I would like to do more to help solve this.
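For anyone who wants to reproduce such timings, here is a rough sketch with a timeout guard so a hanging depth doesn't block the loop (`true` is a stand-in for the actual `npm --depth "$depth" update` call, so the snippet runs anywhere):

```shell
# Time the update at increasing depths, aborting any run that exceeds
# a 120-second budget. Replace `true` with: npm --depth "$depth" update
for depth in 13 15 20 40; do
  start=$(date +%s)
  if timeout 120 true; then
    echo "depth $depth: $(( $(date +%s) - start ))s"
  else
    echo "depth $depth: did not finish within 120s"
  fi
done
```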
UPDATE: It seems that npm update --depth 999 still hangs even if the latest enzyme is my only dependency, so here are the simplest steps to reproduce:
mkdir npm-hang
cd npm-hang
npm init -y
npm add enzyme # this installs enzyme@3.11.0 at the time of writing
npm update --depth 999 # this hangs
Again, FWIW, I'm attaching both package.json and package-lock.json, just in case it can help shed some light on this issue: