With the introduction of pkg.exports, a module only exposes the paths explicitly listed in `pkg.exports`; any other path can no longer be required. Let's have a look at an example:
Node 12.16.3:

```js
> require.resolve('uuid/dist/v1.js');
'/example-project/node_modules/uuid/dist/v1.js'
```

Node 14.2.0:

```js
> require.resolve('uuid/dist/v1.js');
Uncaught:
Error [ERR_PACKAGE_PATH_NOT_EXPORTED]: Package subpath './dist/v1.js' is not defined by "exports" in /example-project/node_modules/uuid/package.json
```
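For context, this is the general shape of an "exports" map that triggers the error (a simplified illustration, not uuid's literal field, which has more conditions):

```json
{
  "name": "uuid",
  "exports": {
    ".": {
      "require": "./dist/index.js",
      "import": "./dist/esm-node/index.js"
    }
  }
}
```

Only the listed `.` subpath is reachable from outside; every other path, including `./dist/v1.js` and `./package.json`, now errors.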
So far, so good. The docs describe this behavior (although not super prominently):

> Now only the defined subpath in "exports" can be imported by a consumer:
>
> …
>
> While other subpaths will error:
>
> …
While this meets the expectations set out by the docs, I stumbled upon `package.json` no longer being exported:

```js
> require.resolve('uuid/package.json');
Uncaught:
Error [ERR_PACKAGE_PATH_NOT_EXPORTED]: Package subpath './package.json' is not defined by "exports" in /example-project/node_modules/uuid/package.json
```

For whatever reason I wasn't expecting the documented rules to apply to `package.json` itself, since I considered it package metadata, not a package entry point whose visibility a package author would be able to control.
This new behavior creates a couple of issues with tools/bundlers that rely on meta information from `package.json`:

1. All affected modules have to add `package.json` to the `pkg.exports` field.
2. Tools have to handle a failing `package.json` resolution gracefully, since it might very well be the case that a given package doesn't need to export any bundler meta information (and otherwise almost all packages on npm that could ever be used in a react-native project would have to add `package.json` to their exports).
3. Even if tools handle missing `package.json` exports gracefully, I see the risk that many modules that currently rely on their `package.json` simply being externally consumable without additional effort might suddenly behave in odd ways when the meta information from `package.json` is no longer readable by bundlers (and no error is thrown).

Examples where this issue already surfaced:
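One way a tool could handle a failing `package.json` resolution gracefully is to treat "not exported" the same as "no metadata present". This is only a sketch of the idea; the helper name is mine, not taken from any of the tools referenced here:

```js
// Resolve a dependency's package.json path, treating both "package not
// installed" and "package.json not exported" as "no metadata available".
function tryResolvePackageJson(name) {
  try {
    return require.resolve(`${name}/package.json`);
  } catch (err) {
    if (err.code === 'ERR_PACKAGE_PATH_NOT_EXPORTED' ||
        err.code === 'MODULE_NOT_FOUND') {
      return null; // caller falls back to default behavior
    }
    throw err; // unexpected failure: surface it
  }
}
```

A bundler could then fall back to defaults whenever this returns `null`, instead of crashing on `ERR_PACKAGE_PATH_NOT_EXPORTED`.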
Now the question is how to move forward with this?

1. Document that `package.json` can no longer be resolved unless added to `exports`. EDIT: Already done in https://github.com/nodejs/node/commit/1ffd182264dcf02e010aae3dc88406c2db9efcfb / Node.js v14.3.0
2. The other option would be to consider adding an exception for `package.json` and always export it.

I had some discussion on Slack with @ljharb and @wesleytodd but we didn't come to an ultimate conclusion yet 🤷‍♂️.
cc @nodejs/modules-active-members
> other option would be to consider adding an exception for package.json and always export it.
To me this seems like a great solution. Having the package metadata is an awesome ergonomic feature of the current setup, and having module authors explicitly have to opt in would be a huge burden across the community. To me it seems like we would need a concrete reason not to have this exception. Can anyone think of a reason to make this case?
@wesleytodd I think it just comes down to what is public/private still. People putting public data into a `package.json` for things like tools to consume isn't really an issue. People putting configuration data like secrets in is more the concern. I imagine it would still be able to be censored if people rewrote the import.
However, I'm unclear on the privacy model here, since the benefit seems largely to be around bundlers, which wouldn't run with the same constraints: they are ahead-of-time tools and are generally thought to access things in a more permissive manner than two independent modules with mutual distrust. It seems the problem is in part that these tools are using APIs that make them act at the same level of trust as any other module, and other packages, when they upgrade, are removing permissions to view the package.json data (even if by accident). I think the concrete discussion here is whether people should have to opt out of exposing package.json to avoid an accident-prone workflow, which is the inverse of all other resources in the package.
A different option, since there is a specific use case that seems to need this, would be a flag of some kind for these ahead-of-time tools: either for `require.resolve`, which looks to be the cause of the issues above, for node via CLI/ENV, or something else. I do think providing an exception would make things just work, but it would somewhat go against the privacy intent of `"exports"`.
My main concern with exposing it via exports is that there are two options:

The first option would force every package to treat its metadata file as a public API. Some user's eslint config reuses what I put into my package.json#eslint section? Well, it's exported, so how can I blame them.

The second option means that tools may not actually get the metadata when they think they're loading the metadata. It could be argued that it only affects "weird" packages, but given a sufficiently large number of users that know about this "trick", I can totally see people use it.

I think bundlers shouldn't use require ("load code for this environment") to load metadata. So I'd rather have a new API to load the package.json that belongs to a specifier/referrer combination, exposing logic we already have in the loader. That API could be used by bundlers etc. to cleanly get the metadata without having to hijack require.
I think the primary issue is that there's no way besides `require.resolve` to resolve the package.json for a package. Even `path.join(require.resolve(pkg), 'package.json')` won't work, because some packages might have their "main" resolve to a subdirectory.
Here's the recommended way to do this with ES modules:
```js
import { readFileSync } from 'fs';

(async () => {
  const pkgPath = await import.meta.resolve('pkg/');
  console.log(pkgPath);
  console.log(readFileSync(new URL('package.json', pkgPath)).toString());
})();
```
The above also simplifies with TLA of course.
Currently the above only executes with `--experimental-import-meta-resolve`.
@nodejs/modules-active-members I think we should discuss unflagging this feature.
I also want to clarify that the problems I have seen in the wild were always just about `require.resolve('pkg/package.json')` in order to then load that file from the filesystem. I didn't see anybody trying to directly `require('pkg/package.json')` to really load the JSON data as a module.
@guybedford `import.meta.resolve('pkg/')` would fail if the package didn't have a main/dot, wouldn't it? Or if the `./` was mapped to a different directory, like `./src`?
@ljharb no, the trailing `/` is specially specified to allow resolving package boundaries.
@guybedford `require.resolve('es-get-iterator/')` in latest node throws `Package exports for '$PWD/node_modules/es-get-iterator' do not define a './' subpath`. I would expect `import.meta.resolve` to behave identically, so that the same capabilities exist in both CJS and ESM.

In other words, `import.meta.resolve` only solves this for ESM, not CJS, so it's not a solution to this problem.
@ljharb yes, because trailing slashes in CommonJS still apply extension searching, which the ESM resolver does not, which is a fundamental difference between the module systems.
> People putting configuration data like secrets is more the concern (@bmeck)

This is not a concern. Using exports to hide secrets is not ever a reasonable solution anyway.
> Make it non-configurable. ./package.json always has to map to ./package.json. (@jkrems)

This is what I was thinking as well. It seems perfectly reasonable to enforce this constraint.
> The first option would force every package to treat its metadata file as a public API. (@jkrems)

It already is. This is not a change in the ecosystem as it is today. Every file is part of the public api, and needs to be treated as such. If projects choose not to strictly follow semver, that is a different issue.
> Here's the recommended way to do this with ES modules: (@guybedford)

We can also use hacks around `require` for this. The point is that the most ergonomic way is also popularly in use, so it should be added as an exception to the `exports` spec, even if there are other ways around it.
> I didn't see anybody trying to directly require('pkg/package.json') to really load the json data as a module. (@ctavan)

I have seen this in many places. I am not going to spend the time now collecting references, since I don't think this should be the primary focus of the discussion, but if it becomes a key point I am happy to dig for them.
I think allowing censorship is a necessary good, and `./package.json` always mapping to `./package.json` would not seem to allow that. In particular, people do set environment variables in their `./package.json` when deploying things in various environments that support it. Environment variables might be able to be removed at runtime via `process.env`, but if the deployment does not have a writable filesystem they could not censor their `package.json`. I am not really here to judge if this workflow is a good idea, just to note that it does present a concern for myself.
> I think allowing censorship is a necessary good, and ./package.json always mapping to ./package.json would not seem to allow that.

Making it more complicated for everyone seems to strongly outweigh this. I know the point of `engines` is to help package authors have more explicit contracts with their users, but if this just breaks everyone's tooling it is a net negative to the community, especially for the package authors who now have to deal with this added complexity.
> I am not really here to judge if this workflow is a good idea, just to note that it does present a concern for myself.

Do you have examples of this type of workflow? The app-developer use case is not what I was considering at first, so maybe there is some common practice like this I have not seen. If so, we would not want to break it. That said, I feel like the current state before `exports` had no restrictions on this, so I am not sure how we would be making anything worse.
I'm confused about why this is a concern; if you put secrets in a place on the filesystem that the node user can access, your secrets are already exposed. `exports` is not a security feature, as we discussed many times during its development.
I think the problem is more about making the `package.json` file part of the public API of a package.

The goal of exports is to fully encapsulate the public API of a package in a way that allows sound analysis of execution, optimization, breaks, etc.

Exposing the `package.json` goes against this by making the properties of the package.json part of the public API.

There are many ways to access the package.json otherwise - you are not stopped from doing it, it just takes a little more code. Updating `require.resolve` patterns to an `fs.readFile` pattern is all it is.

Also note that this applies far more to frameworks than libraries. Frameworks can at least take the effort to understand the problem here and fix the root cause, I'd hope.
> Exposing the package.json goes against this by making the properties of the package.json part of the public API.

I think the goal would be to explicitly document this fact (and codify it as part of the implementation). Just call it part of the public api, always and forever, and be done with it. And to be clear, adding `exports` broke the existing behavior, which was that all files in a package are part of their public api. So going back on one clearly good exception seems to be a more reasonable middle ground than breaking every tool which relies on this today.

> Also note that this mostly applies far more to frameworks than libraries.

Not sure I understand the distinction here. I have libraries which load `package.json`s to inspect them via `require`.
That's the problem - you can't update to a readFile pattern if you can't get the path to the file robustly, in CJS. That's not possible right now for a package with exports, that doesn't include package.json, and whose dot/main either is set to false or points to a subdir.
This focus on `package.json` seems incidental to me. As a user, I would very much like to be able to require()/import items from the file system, which is the most apparent & comprehensive truth to me.

That this is no longer possible if there is a pkg.exports seems like a very critical degradation of what I as a consumer of modules would hope & desire. If package.json exports do export something, fine, I'll take that, but I should continue to be able to require/import files that a package distributes. Including package.json.

I beg node to please adjust course & not hide the file system the moment an author declares a package.json `exports`.
That's the entire purpose of "exports", and it's a highly desired one - that's not something that we're discussing here.
> That's the entire purpose of "exports", and it's a highly desired one - that's not something that we're discussing here.

Well, where do we discuss it, jordan? Because it's a bad choice & confusing for everyone. There should be room to fall back to actual real resources when they're not defined in this new abstract package.json system node invented for itself.

I don't see why we shouldn't have both. It would solve this issue. It would allow people who have for a decade now required()'d resources to continue to do so when their package authors miss this or that resource. I think the package consumers deserve more than they are getting with this "highly desired" system.
@rektide the full resources are still available at `require('/absolute/path/to/package.json')`. exports only provides filtering when entering the package through the public interface, via `require('pkg/subpath')`.

If the problem is how to resolve the package path without having a suitable subpath, this is what the trailing slash was designed to allow, per the example in https://github.com/nodejs/node/issues/33460#issuecomment-630452758.

If you don't like the change, don't adopt exports.
@ctavan you make a good point in https://github.com/nodejs/node/issues/33460#issuecomment-630454987. Perhaps one option could be to treat `package.json` as an exception in `require.resolve` ONLY (and not for `require`), where on a PACKAGE_PATH_NOT_EXPORTED error an internal fallback resolution approach applies.

I would not want such a path implemented for `import.meta.resolve` though.
Anything `require.resolve` resolves must also be obtainable via `require`, otherwise the entire thing doesn't make sense.

Having it just for `require.resolve` would definitely be an inconsistency in the name of backwards-compat practicality, yes.
To me the viable options are:

1. force everyone to add `./package.json` to "exports" in order for tools to work
2. `import.meta.resolve` with a trailing slash, which only works in ESM and thus forces option 1 anyways
3. make `"./package.json": "./package.json"` an implicit part of `exports`, forcing you to do something like `"./package.json": false` to opt out
4. add a new API to resolve the directory containing a package's `package.json` (so that you can do something like `readFile(path.join(packageDir('package'), 'package.json'))`; note: this would not have any semver/API implications on a package, and would mirror `import.meta.resolve('package/')`'s behavior)

I understand why the third option (implicit) is not desirable. I do not understand, however, why we find either options 1 or 2 acceptable, and I think it's worth exploring option 4.
In the meantime, it would be ideal to update the documentation for "exports" to address this likely-common hazard.
Let me add some thoughts from the point of view of a heavily depended-upon npm package like uuid.
That package provides 4 named exports (none of them qualifying as a default export) and a typical user will only need one of them in their project. Until ESM became a thing, and with it a robust way of performing tree shaking for browser bundles, we recommended that our users deep-require the respective files, like `const uuidv4 = require('uuid/v4');`, which effectively required the `v4.js` file from the package root.
With widespread adoption of ESM in browser bundlers (which is where tree shaking matters), we decided to move away from this deep-require API and instead started encouraging users to `import { v4 as uuidv4 } from 'uuid';`.
Getting rid of the deep requires forced our users to adjust their code and resulted in lots of bogus "bug" reports from users who apparently ignored our deprecation warnings (which even contained a link with exact instructions on how to upgrade).

So from the perspective of an author of a popular npm package it is indeed a very desirable feature to be able to restrict the public API to a minimum, simply for the fact that you may get rid of bogus bug reports that exclusively result from unintended require-usage by some users.
Now regarding tooling config in `package.json`: I think it was probably a bad idea to put framework/tooling-specific configuration into `package.json` in the first place. It would probably be much better if packages which need to expose framework-specific configuration did so in an explicit manner, e.g. by explicitly adding `./reactnative.conf.json`, `./svelte.conf.json`, `./rollup.conf.json` to their `exports`.

Unfortunately that's not how it went; `package.json` has always been a "reliable" source in the sense that any tool could rely on its mere existence.

So I'm of two minds here: I think _in principle_ this is a nice opportunity for cleaning up packages' public APIs. On the other hand this is likely to cause significant cost across the community.

I agree however that, as a compromise, @ljharb's option 4 is worth exploring. And in any case I totally think that the documentation should be more explicit (which is why I raised this issue in the first place).
FWIW, copying an interesting suggestion by @SimenB from https://github.com/browserify/resolve/issues/222#issuecomment-630639258:

> For the "I need to load `package.json`" use case, why not do something like this:
>
> ```js
> const { sync: pkgUp } = require('pkg-up');
> const packagePath = pkgUp({ cwd: require.resolve('uuid') });
> ```
>
> https://github.com/sindresorhus/pkg-up
>
> Could trivially be packaged up into a `pkg-of-module` and published to npm if people want.
>
> I guess it could be wrong if people have nested `package.json`s, but that seems like an edge case we shouldn't care about.

_Originally posted by @SimenB in https://github.com/browserify/resolve/issues/222#issuecomment-630639258_
Nested `package.json` files should not be discarded, as they may become more prevalent once people realise that they can include a `{ "type": "module" }` package.json within a subdirectory to allow using `.js` extensions for ES modules within an otherwise-CommonJS package.

When trying to get the package's root package.json, nested ones are irrelevant; but the fact that they may exist highly complicates any workaround to discover the package dir reliably.
> make "./package.json": "./package.json" an implicit part of exports, forcing you to do something like "./package.json": false to opt out

But no opt-out is my preference. If a package author doesn't want some part of their metadata public, move it to another file. I don't see why this is so contentious, and I have not seen a single real-world case where hiding the package.json is reasonable.
I sympathize with the use case, but I also don't want to take away package authors' ability to define their public API as _not_ including `package.json` if they wish; and the mental overhead of anyone using `"exports"` having to read about/know about this exception is bad UX. And since JSON files aren't `import`-able, the current pattern doesn't translate well into ESM.

That said, I think there _should_ be a way to get this information easily enough, so if that means a new API, then so be it. `import { getPackageMetadata } from 'module'`, where `getPackageMetadata('pkg')` returns the root `package.json` for `pkg`? Sure, and it could work identically in CommonJS as in ESM. How is this any different or better than just making an exception to `"exports"`? In my mind, this is equivalent to `fs.readFile`, in that `"exports"` already explicitly allows any-file access when people aren't using `require` or `import`, so this is the same thing. Or if we can use `resolve` somehow to achieve the same goal, that would be fine too. We could even create a new API, like `import { resolvePackageRoot } from 'module'`, that would only look at the bare specifier (`resolvePackageRoot('pkg')`) and throw on any strings that contained slashes; therefore it could also function identically between CommonJS and ESM.
@GeoffreyBooth either of those two APIs would indeed solve the issue without risking implicitly expanding a package's API.
I would not oppose a new API in the long run. That said, there is a clear downside to a new API which is not present in the proposal to forcibly include the `package.json` in the package API: it requires maintainers to migrate their tools.

This is a HUGE burden on the entire community. Things will be missed, issues will be filed, people will spend time migrating. All of this can be avoided by just keeping the existing behavior for `package.json` files and letting them always be `require`d.

Until there is a clear real-world example of why a package author would ever have a reason to restrict access to the required package metadata, I am not sure why we would consider adding an API and all the associated work.
Maybe two functions:

```js
// Get the relevant metadata for the given resolution.
// Could either reject or resolve to `null` for relative specifiers.
getPackageMetadata(specifier, referrer) -> Promise<PkgJSON>

// Look up package metadata that would be used to interpret the given URL,
// e.g. to look up how to interpret yoloscript files (.js).
getPackageScopeMetadata(resolved) -> Promise<PkgJSON>
```

These could be provided in userland but I think it would be worthwhile to expose them in node itself. If it's in node, we'd likely return a copy of the metadata.
> It requires maintainers to migrate their tools.

Tools that check `package.json` for packages that use `exports` are exceedingly likely to need to migrate one way or another. Node added a feature to the resolution; tools that don't change will diverge from what package authors may expect now. So... it almost feels like an upside when there's a clear error that tells users that the tool hasn't been updated to properly support one of their dependencies.
> Tools that check package.json for packages that use exports are exceedingly likely to need to migrate one way or another.

There are many reasons to load a package.json, many of which should not have to change. I agree this one case, where you are loading the `package.json` to decide how to resolve source files, is guaranteed to require tool changes, but there is no need to break all tools just for the one use case, right?
Re the tools needing to update, two things:

1. There aren't _that_ many tools that would need to update. This isn't a case where we have millions of users needing to update their code. The various tools that do this now would need to change, but that's it: a few dozen or hundred projects, for a trivial change. I understand it's annoying for those tool authors, but there are many tens or hundreds of thousands of package authors who _don't_ need this exception and shouldn't need to spend mental effort learning it.

2. Creating an exception for `package.json` would enshrine what some on this thread consider a bad pattern, that of stuffing random configuration data into `package.json` rather than into separate files like `.babelrc` or wherever. We should perhaps make a new API that returns the path to the package root, e.g. `./node_modules/pkg/`, rather than the path to the `package.json` or the `package.json` contents, as then tools could use this package root path to potentially read files other than `package.json` to find their metadata or configuration data.
I propose that it would be entirely valid to expect current `package.json` readers to fail more gracefully when being denied that opportunity by `require()` or `require.resolve()`. If they have configuration that they might be interested in reading from a package, being denied that opportunity should equate to that configuration not being defined in said `package.json`.

If, on the other hand, the package does include some configuration in `package.json` or elsewhere that might be considered a part of its public API, those paths should be expected to be included in the `"exports"`.
> There aren't that many tools that would need to update.

This is an optimistic take IMO. Even if we assume the number is relatively small, it still would require all users to upgrade to versions which support the new behavior. This will be a long migration no matter which way you slice it.
> Creating an exception for package.json would enshrine what some on this thread consider a bad pattern

The `package.json` has had a special position, and will continue to for as long as I care to look into the future. We have "enshrined" the `package.json` as the required place for packages to put the required metadata in many other ways; this will not make it any more enshrined than it already is.

Secondly, there is a big difference between "loading package metadata" and "loading random configuration data". The first is required for tooling, and works great today. The second is an unintended side effect, which has great ergonomic advantages, so package authors have decided to leverage it. In one fell swoop, this addition will take away both of these features. I agree that arbitrary additions to `package.json` can have issues, but the purism of avoiding an exception is pointless when this feature gives so much value to the ecosystem. And loading package metadata is just a requirement; this should be something we codify/enshrine, as it is a very important tooling feature.
Haven't had a chance to read everything, but I wanted to note that `package.json` not being included in exports is already documented:

https://github.com/nodejs/node/blob/master/doc/api/esm.md#package-entry-points

One thing we could do as a start would be to make a specialized error message.
oops, butter fingers. sorry for the close
For reference the documentation was updated in this commit: https://github.com/nodejs/node/commit/1ffd182264dcf02e010aae3dc88406c2db9efcfb
From my perspective as the reporter of this issue, the relevant new part (not yet released on https://nodejs.org/dist/latest-v14.x/docs/api/esm.html#esm_package_entry_points; @MylesBorins I'm sorry, I only checked the released docs, not master) covers the documentation part of this issue:
> Warning: Introducing the `"exports"` field prevents consumers of a package from using any entry points that are not defined, including the `package.json` (e.g. `require('your-package/package.json')`). This will likely be a breaking change.
A special error message would have helped me personally since, as explained in the original issue description, I _thought_ I was well aware of the new `exports` behavior but somehow considered `package.json` to be meta information and assumed the exports restrictions wouldn't apply…
Haha, I'm happy that I am not the author of any bundlers or tools that need information from `package.json`, and I don't need to think about how to resolve the path to it with the new `exports` behavior. As I said in one of the issues above: module authors will not include `package.json` in exports.
@farwayer i have 250+ modules, and although only a few have "exports" so far, every single one of them will include `package.json` in its `exports`.
> every single one of them will include package.json in its exports.

To me this is exactly the problem. If all responsible module authors have to add it, then what is the point of not making it baked in?

Note that I won't be doing it because I have to, necessarily, but because I don't see the point of not exposing it.
@ljharb You are a good module author who reads the nodejs docs and thinks about tools developers :wink: Many others will not think about exporting it. `package.json` was the only stable point. Once `exports` is widespread, we can't depend on it any more.

I like your idea about a method for resolving the package path. But I have another idea: what do you think about adding a new universal API for providing all package meta information? `path`, `type`, `exports` from the start; maybe `name`, `version`, `description` and `license` if available? It could be super useful for tools, because it would allow them to avoid reading the `package.json` file directly if no extra info is needed. And it could be extended in the future.
I think the best path forward is to always publish package.json, as it includes important information about the current module. That way, we won't need to add new APIs to core, and most things would work as is.
(I just stumbled upon this, and it seems a pretty valid concern, I think we should change it).
Any chance of working with package managers to update the behavior of `package.json` generators?

For example, when the interactive `npm init` command is run, the automatically generated `package.json` file would contain not only a `main` entry, but also an `exports` entry exposing the package metadata:
```json
{
  "name": "test",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "exports": {
    ".": "./index.js",
    "./package.json": "./package.json"
  },
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC"
}
```
Then there would be no magic whatsoever from Node.js side, yet the ecosystem would get their preferred defaults. Getting tooling to solve the problems of tooling sounds fair, right?
I think a solution where tools are allowed to assume that `exports` includes an entry for `./package.json` would be less desirable. If tools can't be changed to use anything but `require` for loading metadata, the fix is to make sure `require` always returns the metadata. I don't think a situation where node considers one thing expected behavior (`exports` explicitly lists possible specifiers and exporting `package.json` is a choice) and ecosystem tools assume another behavior as expected (`exports` always has to include `./package.json`) would help here.

If anything, making `./package.json` implicit would be better because it may allow tools to skip it when distributing packages. It definitely creates a gray area where a minimal web-ready distribution now has to decide if it wants to risk excluding `package.json`, because it will always technically be part of the exported files. Which is unfortunate, but it may be a price worth paying to support existing node-focused tools.
> Any chance of working with package managers to update the behavior of package.json generators?

This is a long process. I have been working with folks at npm on some other updates; it just requires a fair bit of work to make progress on. I don't think relying on generators across the ecosystem updating is a workable way to make this happen.
> I don't think a situation where node considers one thing ... and ecosystem tools assume another behavior

I am not clear on your meaning here. I agree that tools and node doing something different is really undesirable, but I think the goal here is to have one clearly defined behavior which includes a good way to access the package metadata. Is that what you mean in your second paragraph?

The second paragraph also talks about browser compatibility, which is a different concern that might warrant more description to understand the use case.
I've added the modules agenda label and included it in the list for the meeting tomorrow. This has come up late so it's at the end of the agenda so whether we get to it and how much progress is made will be time dependent.
Given the 14.3.0 release yesterday, the documentation part of this issue is IMHO now sufficiently honored in https://nodejs.org/dist/latest-v14.x/docs/api/esm.html#esm_package_entry_points.

Since the discussion now mostly centers around how to deal with exporting `package.json`, I have renamed the issue description accordingly.
So I'm not opposed to creating this `package.json` exception, but I would prefer if we didn't, because of the education concerns (it makes `"exports"` harder to learn and therefore slows adoption) and because of the potential for bugs (now there's one more code path in our loader, users might think that `package.json` being available when they didn't export it is a bug, etc.). And if someday some _other_ tool reads the `"exports"` exported paths, perhaps to generate an import map, is it supposed to include `package.json` or not? This exception just makes the whole feature permanently more complicated, for what feels to me like a fairly small temporary gain of allowing a few tools to not need to upgrade.
Which makes me wonder, just how many tools are we talking about? If it's in the dozens or even the low hundreds, then I think a better solution is for those tools to just update to use a different method for getting this package metadata. I read through some of the issues linked to this one and apparently plenty of tools _haven't_ had this issue, like Webpack and Rollup. I think it's premature for Node to jump to a solution when we don't have a good sense of how widespread the problem is. It's not the case that all package authors need to add `package.json` as an export; it's only that tools looking for a `/package.json` export need to use other means of finding it, like Webpack and Rollup already do.
This will break APM which, for better or worse, often has to patch non-public elements of a package.
`package.json` is imported for every package we monkey-patch. We need the version, which very often determines what needs to be patched in the package.
One example of a non-public interface requiring patching in order to maintain context is mongodb, for which it is necessary to patch each of the `mongodb/lib/core/topologies/` files: `server.js`, `replset.js`, and `mongos.js`. If it is not possible to require and patch files that are not exported, we will fail. There are many other examples I can list if it would be helpful.
@bmacnaughton is there any reason to do this by placing it in the module cache/through the module loader compared with using `fs` to read these instead of `package.json`? Per https://github.com/nodejs/node/issues/33460#issuecomment-630452758. Not attempting to state pro/con here, just trying to see if an `fs` solution would be adequate instead of module loading. In particular, modules are generally treated as singletons and share state/mutations over time and are put into a cache/persistent storage.
(note that "using fs" still requires a way to reliably get the `package.json` path, which `exports` has disrupted)
@ljharb yes, but the question for me is still whether the file actually needs to be loaded through the module system vs just resolved.
for apm we need a reference to the live exports.
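To illustrate why the live exports matter: wrapping a function on the cached module object means every consumer that obtains the module afterwards sees the instrumented version, which rewriting a private copy of the source cannot guarantee. A minimal sketch using a stand-in exports object (not a real package):

```javascript
// Stand-in for a cached module's exports object, as it would appear in
// require.cache[...].exports.
const mod = {
  query(sql) {
    return `rows for ${sql}`;
  }
};

// Monkey-patch in place: any consumer holding a reference to `mod`
// (e.g. via the require cache) observes the wrapped behavior.
function patchQuery(target) {
  const original = target.query;
  target.query = function (...args) {
    // A real APM would start a span / restore async context here.
    return original.apply(this, args);
  };
  target.query.__wrapped = true; // marker so we don't double-patch
}

patchQuery(mod);
console.log(mod.query.__wrapped);   // true
console.log(mod.query('SELECT 1')); // "rows for SELECT 1"
```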
> is there any reason to do this by placing it in the module cache/through the module loader compared with using fs to read these instead of package.json
it's not clear to me what you're asking (there is a lot to absorb in this subject area and i'm not as deeply steeped in it as you are), so i'll try to go through the possibilities as a way of exploring the range of solutions.
it's possible for APM to read `package.json` with `fs`, provided we know where `package.json` is. we never mutate `package.json` and it is a well defined format, so there is no need for us to acquire it via `import`. but as the module system has already resolved, read, and converted the text to an object, it seems more straightforward to not duplicate the code. providing APM with the results of the resolution, reading, and parsing is optimal from the APM perspective.
is it possible for APM to read any file it needs to patch with `fs`? yes, we can read the files, but the implication here is (i think) that we patch source as opposed to the live exports. i believe that's possible but it's highly undesirable for two reasons that are top of mind: 1) any source changes, not just API changes, have the potential to break patching code, and 2) multiple loaders working in this fashion would each make source changes even if the API didn't change.
here's pseudo-code (ignore errors, checks, etc.) for how our monkey-patching works now, with CJS:
function patchedRequire (name) {
  let mod = realRequire.call(this, name);
  let pkg;
  if (!builtinModule(name)) {
    pkg = realRequire.call(this, `${name}/package.json`);
  }
  if (!patched.get(mod)) {
    mod = realRequire(ourPatchingCode[name])(mod, { version: pkg.version });
  }
  if (realRequire.cache[path]) {
    realRequire.cache[path].exports = mod;
  }
  return mod;
}
// the patching code generally wraps various functions/classes/constructors with
// code that maintains async context. certain modules, like http/https, are modified
// to create context for instrumentation. in order to do this with many packages,
// files other than published API entry points must be patched.
function ourPatchingCodeCassandra (mod, info) {
  const version = info.version;
  ...
  if (semver.satisfies(version, '>=3.3.0')) {
    RequestExecution = requirePatch.relativeRequire('lib/request-execution.js');
    PrepareHandler = requirePatch.relativeRequire('lib/prepare-handler.js');
    patchSendOnConnection(RequestExecution.prototype, sendOnConnection, connection);
    patchPrepareHandler(PrepareHandler);
  } else {
    ...
  }
}
I've played a bit with bmeck's node-apm-loader-example (and updated it to work with the current API for the `resolve()` hook and the `defaultResolver()`). As it is, it's possible to use it in a similar fashion for the top-level packages. But APM needs access to files that are not exported. And devsnek is right: APM needs access to the exports, not just the source file, in order to instrument modules reliably. An additional presumption is that whatever the `resolve()` hook returns is what goes into the cache; if not, then some means of replacing the cache is necessary.
I think one interesting thing of note here is: APMs currently load `package.json` via `require` while node itself doesn't. Which means that APMs cause all the `package.json` metadata to be parsed twice and retained in memory twice. I personally would love to see an official "get metadata" API that returns effective package metadata for a given specifier/context pair, as seen by node. But it would require that node retain the version field (which it currently doesn't).
@jkrems presumably it would require node to retain 100% of the package.json data, since any of it might be needed by tooling?
@ljharb - our APM instrumentation only uses the version at this point; it might need to know about the exports section for esm.
I think an important thing that I missed was the chaining. Currently `package.json` metadata has no hook for loading, though it could be replaced when importing. This seems like we are missing some hook, so I'd agree with @jkrems
@ljharb I think more general tools that need the entire `package.json` wouldn't mind using a userland solution that does something similar but with more configuration options and a more verbose cache. I'm not sure the same answer has to apply fully to APMs and bundlers/linters/etc.
> I think more general tools that need the entire package.json wouldn't mind using a userland solution that does something similar but with more configuration options and a more verbose cache.
I am writing such a tool (and have written others in the past), and I don't want any of these. I want simply what require does pre-exports. It works really well, is simple, and is well understood/documented. Any userland implementation will probably be less satisfactory in all three of these measures. Also, it seems like this is a regression in developer experience if what worked for years now requires new userland modules to get the same behavior.
> But it would require that node retains the version field (which it currently doesn't).
for the purposes of APM it seems that keeping `version`, `type`, and `exports` would be all that's needed. I think being able to inquire about the version of a loaded package would be a broader benefit. I understand APM is not a core use case but it would be great if this could be worked in.
This was discussed in today's meeting, and we were primarily looking at https://github.com/nodejs/modules/issues/516 as a solution to this problem more generally.
Would a feature like that resolve the primary concerns here?
From my perspective as the original reporter of this issue, nodejs/modules#516 would provide a clean way of resolving the package root. From inspecting the tools that I found to be affected, it should be possible to use this new way of resolving in all of these modules.
However, it would still require every tool that currently uses `require.resolve()` to adopt this new API. So while we would not need all package authors to include `package.json` in their `exports`, we would still require all affected tool authors to fix their tools; at least we have `O(#tools) << O(#npm packages)`.
So from my perspective, whether that's an acceptable solution really boils down to whether we want to put this burden onto all tool authors (since I'm not an author of any such tool, I don't have a strong opinion here).
I've read a fair bit of the discussion after being hit by this behaviour today. Not sure which solution I'd favour, but thought I'd feedback that the thing I find disappointing is that this change was released in a minor version, despite causing behaviour that breaks many projects.
If the impact was not anticipated, could your integration tests be more representative of the ecosystem?
If the impact was anticipated, but considered not to be a bug as userland should not be relying on this behaviour anyway, could it have been communicated more effectively? All the publicity around ESM arriving in nodejs that I've seen focuses on what it adds, with not even the official announcement mentioning that there are other effects too.
Considering v12 is LTS, it's really surprising that a deliberate change introduced in a minor bump of the runtime breaks my code.
@wheresrhys to be fair, the addition of "exports" is almost always semver-major, and it's the packages that add it, not node, that would break things.
Running a package on node 12.16.3 works, running it on 12.17.0 breaks. The only change is in nodejs.
While I agree it is nuanced as to where exactly in the interplay between package, runtime and consumer the responsibility lies, it's clear (because it's the only thing that changed) that at least some falls on nodejs' decision to release this change outside a major.
It cost me a couple of hours of headscratching to figure out what had gone wrong. Multiply that across the nodejs community and that's a lot of disruption due, at least in part, to this decision. I just hope lessons are learned in how you push out changes in non-major releases in future, as I think, in this instance (with the benefit of hindsight :joy:), it was foreseeable that it'd break things.
@wheresrhys which package? I assume it's one that added "exports"?
FWIW there was concern raised before ESM was unflagged in Node.js 12 and it was discussed in the Release WG and TSC (both meetings were recorded) prior to going ahead. It was acknowledged by people involved with the modules team that there was risk that the change could result in an observable change in behaviour.
@richardlau Thanks for sharing. I suppose then I'd ask if the WG have clear criteria for assessing risks and deciding what to do about them? There are some comments in the thread above (I'm not sure if they are all by WG members) which suggest the scale of the risk may have been underestimated. Also the degree to which nodejs is responsible for mistakes/misconceptions by maintainers of the other parts of the ecosystem (compare this with the "don't break the web" ethos of TC39, W3C, WHATWG)
As far as the Release WG goes (can't speak for the TSC), the answer is no. Perhaps the risk was underestimated, but the resolution of the release meeting was that the unflagging was allowed on the proviso that if it did turn out to be disruptive to the ecosystem it could be reflagged. I'll stick this on the release agenda for next week's meeting for discussion -- @MylesBorins since you're in both the modules team and Release WG perhaps you can bring a recommendation as to what we do for LTS (12.x) in light of this issue?
My personal opinion is that there is no reason to reflag.
Yes, this has been somewhat disruptive, but I think reflagging modules at this point would be substantially more disruptive.
Also, taking a quick look at node-fetch: they introduced `package.exports` in 3.0.0 as a breaking change. While this does mean that the package will work on 12.16 and not on 12.17, the package itself explicitly broke that interface in a semver-major change. It can undo the breakage by explicitly exporting the `package.json`, which @wheresrhys has opened a pull request to fix.
This is entirely in line with what we expected could be broken, and exactly the remediation process we figured could be followed. Yes, it is breaking, but it is broken by the interface of the package in a way that can be fixed without any changes in Node.js core. This is, imho, no different than a package beginning to use any other semver-minor addition to node core: it will break older versions in the same major line that do not have that feature.
@MylesBorins I agree with most of what you say. I'm not advocating for reflagging - I agree it'd be immensely disruptive - but I think there's still room for reflection as to whether nodejs should tread more carefully in future.
> This is, imho, no different than a package beginning to use any other semver-minor addition to node core: it will break older versions in the same major line that do not have that feature.
This is conceptually quite a complex statement. How responsible nodejs is for mistakes in understanding and implementation by package publishers is debatable (somewhere between 'not at all' and 'a little' I guess), but, having identified there was a risk, better communication about that would've helped.
As a package maintainer who did start to use `"exports"`, what was not obvious to me was the existence of tools such as Facebook's Metro bundler that will fail if they can't resolve `pkg/package.json`. It would have been useful (and possibly still is) for that to be better documented in the ESM docs.
In other words, even if I cannot conceive of a reason to need to have package.json exported, at least for any code that could end up in a front-end application, there absolutely _are_ tools that depend on it.
> [...] the existence of tools such as Facebook's Metro bundler that will fail if they can't resolve `pkg/package.json`.
I tried to find an issue on their side but couldn't find one. Was this reported to metro already?
> I tried to find an issue on their side but couldn't find one. Was this reported to metro already?
@jkrems you can check https://github.com/eemeli/make-plural/issues/15 for the relevant references.
Looks like this is the issue on the metro side (linked from `make-plural`): https://github.com/facebook/metro/issues/535
As this is still on the modules agenda and there is no action item for the release team, I'm dropping the release agenda tag.
I guess scanning `node_modules` and adding missing `./package.json` entries to `exports` is easy enough to automate... :trollface:
Dropping the label. @ljharb is going to work on this
I'm confused about why this is a concern; if you put secrets in a place on the filesystem that the node user can access, your secrets are already exposed. `exports` is not a security feature, as we discussed many times during its development.