When at least two levels of dependencies are present, e.g. packageA -> packageB -> packageC, and they are referenced by local path, npm fails to find the reference to the nested package and errors with the following:
npm ERR! code ENOENT
npm ERR! syscall open
npm ERR! path /home/chris/dev/npmCiLocalPath/packageC/node_modules/packageb/node_modules/packagea/package.json
npm ERR! errno -2
npm ERR! enoent ENOENT: no such file or directory, open '/home/chris/dev/npmCiLocalPath/packageC/node_modules/packageb/node_modules/packagea/package.json'
npm ERR! enoent This is related to npm not being able to find a file.
npm ERR! enoent
For local path dependencies (e.g. `file:../packageA`) that are more than one level deep, i.e. a local package references a local package which itself references a local package, npm fails to install the nested dependency.
Reproducible in the following repo:
https://github.com/yahgwai/npm-ci-local-path
Run the commands there to reproduce.
Expected behavior: `npm ci` should not error.
Any ideas why this is happening? It means I'm unable to use `npm ci` with local paths.
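A minimal standalone sketch of the same failure mode (the layout is assumed from the error paths above, not taken from the linked repo, and npm must be on PATH):

```shell
# Three packages chained via file: paths, mirroring the error above:
# packageC -> packageB -> packageA. Layout is invented for illustration.
root=$(mktemp -d)
mkdir -p "$root/packageA" "$root/packageB" "$root/packageC"

echo '{ "name": "packagea", "version": "1.0.0" }' \
  > "$root/packageA/package.json"
echo '{ "name": "packageb", "version": "1.0.0",
        "dependencies": { "packagea": "file:../packageA" } }' \
  > "$root/packageB/package.json"
echo '{ "name": "packagec", "version": "1.0.0",
        "dependencies": { "packageb": "file:../packageB" } }' \
  > "$root/packageC/package.json"

cd "$root/packageC"
npm install || echo "npm install failed"   # writes package-lock.json
rm -rf node_modules
npm ci || echo "npm ci failed to resolve the nested file: dependency"
```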
Maybe same as #496
I'm also facing this issue with `npm i` when a package-lock.json exists. I haven't dug into it yet, but I believe it's caused when npm creates its lock file. The error `npm WARN tar ENOENT: no such file or directory, open` happens after the lock is created.
npm verb correctMkdir /Users/b/.npm/_locks correctMkdir not in flight; initializing
npm verb lock using /Users/b/.npm/_locks/staging-3c447d0dd81b5f70.lock
npm timing audit submit Completed in 568ms
npm http fetch POST 200 https://registry.npmjs.org/-/npm/v1/security/audits/quick 568ms
npm timing audit body Completed in 1ms
npm WARN tar ENOENT: no such file or directory, open '/Users....
In my case all of the files listed in the errors actually exist. Maybe they can't be updated because of the lock(?).
I can confirm the issue. Installing using `npm install` works fine and creates a package-lock.json. Once the package-lock.json is there, repeated runs of `npm install` fail for packages that reference local packages, with the above-mentioned `npm WARN tar ENOENT: ...` errors pointing at files in the .staging folder (only on the second level, i.e. when pckA -> pckB -> pckC).
Deleting the package-lock.json files is not a workaround, as it only works until the files are created again on the next run.
I found that when local dependency references point to a tgz file instead of the project folder, npm is able to resolve the dependency tree correctly and the issue no longer appears.
So for the referenced packages, run `npm pack` after building them. This creates a package archive in your package folder that you can reference from other packages, e.g.:
"dependencies": {
"packageB": "file:../packageB/packageB-1.1.0.tgz"
}
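A self-contained sketch of this pack-based workflow (package names are invented; the tarball filename comes from the package's `name` and `version` fields):

```shell
# Pack a local package and depend on the resulting tarball instead of
# the folder (names invented for illustration):
root=$(mktemp -d)
mkdir -p "$root/packageB" "$root/packageC"
echo '{ "name": "packageb", "version": "1.1.0" }' > "$root/packageB/package.json"

# npm pack works offline; the output filename is <name>-<version>.tgz
( cd "$root/packageB" && npm pack ) || echo "npm pack failed"

# Reference the tarball, not the folder:
echo '{ "name": "packagec", "version": "1.0.0",
        "dependencies": { "packageb": "file:../packageB/packageb-1.1.0.tgz" } }' \
  > "$root/packageC/package.json"
( cd "$root/packageC" && npm install ) || echo "npm install failed"
```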
The difference appears to be that when referencing local packages by folder, npm copies the entire package folder, including its node_modules folder, into the node_modules of the referencing package instead of flattening the references. When referencing a package packed into a tgz, no node_modules is included, so npm reads the package.json of the referenced package and resolves its dependencies correctly.
Why it behaves like this I don't know. I haven't used `npm ci` yet, but I will try that as well.
I'm running node v10.18.0 & npm 6.13.6 (also tried npm 6.13.4).
EDIT: I tried `npm ci`, which also works when using the approach described above.
The problem here is that `npm i`/`npm ci` creates symlinks that the file system doesn't have the full context to follow.
In a system where the dependency graph goes packageA -> packageB -> packageC, npm creates links like this:
/packageA
  /node_modules
    /packageB <symlink to ../../packageB>
/packageB
  /node_modules
    /packageC <symlink to ../../../../packageC>
/packageC
If you were to `cd` into packageA/node_modules/packageB/node_modules, `ls ../../../../packageC` fails for the same reason npm can't find the file: it doesn't have the context to travel _backwards_ along the A->B symlink; it's traversing the file system from packageB/node_modules/. However, if you `cd ../../../../packageC` it _does_ work, because `cd` has context about the A->B link.
In order to fix this, npm is going to need to compute symlink paths in the local dependencies of local dependencies based on the relative filesystem locations of the local packages.
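The physical-vs-logical `..` behaviour can be demonstrated without npm at all. This is a sketch in a temp directory (`realpath --relative-to` is GNU coreutils):

```shell
# Standalone demonstration of the traversal problem, no npm involved:
root=$(mktemp -d)
mkdir -p "$root/packageA/node_modules" "$root/packageB/node_modules" "$root/packageC"

# Recreate the links npm makes:
ln -s ../../packageB "$root/packageA/node_modules/packageB"
ln -s ../../../../packageC "$root/packageB/node_modules/packageC"

cd "$root/packageA/node_modules/packageB/node_modules"

# ls resolves .. physically, from packageB's real location -- this fails:
ls ../../../../packageC 2>/dev/null || echo "ls cannot resolve packageC"

# cd resolves .. logically, using the path we travelled through -- this works:
cd ../../../../packageC && echo "cd landed in $(pwd)"

# What npm would need to do instead: compute the link target from
# packageB's real location on disk:
realpath --relative-to="$root/packageB/node_modules" "$root/packageC"
# prints: ../../packageC
```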
For me the problem appears if the node_modules directory exists before running `npm ci`, so removing it manually first helped a lot.
My package.json looks something like this:
...
"dependencies": {
"submodule1": "file:submodule1/es5/",
"submodule2": "file:submodule2/es5/"
},
...
My directory structure looks like this:
/project
  /submodule1
    /es5
  /submodule2
    /es5
My build script looks something like this:
rm -rf ./node_modules #this line seems to help
( cd ./submodule1 && bash ./build.sh )
( cd ./submodule2 && bash ./build.sh )
npm ci
Tried this several times now in two projects. If I see different results in the future I will update this comment.
I have a similar issue, but not quite the same. I'm working in a monorepo where the setup is like this:
/project
  /ApplicationA
    /clientApp (create-react-app)
  /ApplicationB
    /clientApp (create-react-app)
  /Packages
    /PackageA
    /PackageB
(I'm aiming to share assets and other custom components between the Applications as well as within the packages in some cases)
In this particular setup I have the same dependency in both ApplicationA and PackageA: they both reference PackageB, which in this case is a small library with assets (custom icons) that are used in both ApplicationA and PackageA.
I install with `npm install ../Packages/PackageB --save-dev` in PackageA (PackageA uses one asset exported from PackageB and builds a custom component used by ApplicationA), and with `npm install ../../Packages/PackageB --save` in ApplicationA (ApplicationA uses most of the assets exported from PackageB). Then I run `npm install ../../Packages/PackageA --save` in ApplicationA.
Using `npm pack` and referencing the tarballs solves the issue, but it forces me to bump the version number and run `npm pack` for each iteration, and it makes developing more complex packages harder. When installing a local package from its folder (no tarball), create-react-app will hot-reload changes if I have a watch/compile script in PackageA that compiles to /dist on changes.
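For instance, a hypothetical watch setup in PackageA's package.json (the script names and the use of tsc are assumptions; substitute your own compiler):

```json
{
  "scripts": {
    "build": "tsc --outDir dist",
    "watch": "tsc --watch --outDir dist"
  }
}
```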
So is there any workaround? We're running into CI build errors because we get this error every once in a while; rerunning the build fixes it most of the time.
Given the length of time this issue has been open, is it safe to assume this is not going to be fixed? I just came across this problem at the weekend.
I recently converted my project to use Yarn's workspaces as a way of working round this issue.