Now that Parcel 2 is out of alpha, I thought I'd try adding it to esbuild's bundler performance benchmarks. However, Parcel 2 currently fails to build both of the benchmarks. This issue is about the JavaScript benchmark.
Here is how to reproduce the issue:
git clone https://github.com/evanw/esbuild
cd esbuild
make bench-three-parcel2
I expected Parcel 2 to be able to build my JavaScript benchmark. Parcel 1 can build the benchmark fine.
While the benchmark is a stress test for bundlers, it should also be somewhat representative of a real-world code base. The benchmark consists of ten copies of the Three.js library because that is roughly the size of Figma's frontend JavaScript code base. I haven't tried running Parcel 2 on Figma's frontend code.
Parcel 2 builds for a while, but then starts generating errors that look like this:
@parcel/workers: Worker terminated due to reaching memory limit: JS heap out of memory
Error [ERR_WORKER_OUT_OF_MEMORY]: Worker terminated due to reaching memory limit: JS heap out of memory
at Worker.[kOnExit] (internal/worker.js:195:26)
at Worker.<computed>.onexit (internal/worker.js:141:20)
I haven't investigated why this happens. Since Parcel 1 can build this benchmark fine, it's possible that Parcel 2 is simply using too much memory, in which case the best fix would be to reduce that memory usage.
The usual way to work around problems like this is to pass --max-old-space-size to node. However, I couldn't do this easily because the processes running out of memory are child worker processes, and there doesn't appear to be a configuration option in Parcel 2 for increasing the memory limit of those workers.
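For reference, the underlying Node worker_threads API does allow a per-worker heap limit via the resourceLimits option, but there doesn't seem to be a way to reach it through Parcel's configuration. A minimal sketch of the raw Node API (the worker file name here is hypothetical):

const { Worker } = require('worker_threads');

// 'heavy-task.js' is a hypothetical worker script, used only for illustration.
const worker = new Worker(require.resolve('./heavy-task.js'), {
  // Raise this worker thread's old-generation (main) heap limit to 8 GB.
  resourceLimits: {
    maxOldGenerationSizeMb: 8192,
  },
});

worker.on('exit', code => console.log(`worker exited with code ${code}`));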
| Software         | Version(s)    |
| ---------------- | ------------- |
| Parcel           | 2.0.0-beta.1  |
| Node             | v12.16.2      |
| npm/Yarn         | npm 6.14.5    |
| Operating System | macOS 10.14.6 |
> Since Parcel 1 is able to build this fine, it's possible that this is due to an issue with Parcel 2 using too much memory

Parcel 1 didn't do scope hoisting by default; Parcel 2 does...
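Roughly speaking, scope hoisting concatenates all module scopes into one top-level scope instead of wrapping each module in its own function, which plausibly explains the extra memory: the bundler has to analyze bindings across the whole bundle. An illustrative sketch of the difference in output shape (not actual Parcel output):

// Without scope hoisting: each module keeps its own function wrapper and is
// looked up through a small runtime registry when required.
const modules = {
  './math.js': (module) => {
    module.exports.add = (a, b) => a + b;
  },
  './index.js': (module, requireModule) => {
    const { add } = requireModule('./math.js');
    console.log(add(1, 2)); // 3
  },
};
const cache = {};
function requireModule(id) {
  if (!cache[id]) {
    cache[id] = { exports: {} };
    modules[id](cache[id], requireModule);
  }
  return cache[id].exports;
}
requireModule('./index.js');

// With scope hoisting: module scopes are merged into a single scope and
// identifiers are renamed to avoid collisions, so the same program becomes:
const add = (a, b) => a + b;
console.log(add(1, 2)); // 3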
> Here is how to reproduce the issue:

Builds successfully for me; peak memory was about 2.5 GB.

| Bundler | real (s) | user (s) | sys (s) |
| ------- | -------- | -------- | ------- |
| parcel2 | 185.30   | 450.52   | 55.30   |
| rollup  | 44.92    | 60.85    | 4.63    |
| webpack | 56.07    | 73.82    | 4.59    |
| esbuild | 0.60     | 2.06     | 0.70    |
| parcel1 | 142.84   | 380.36   | 33.81   |
| fusebox | 57.07    | 70.48    | 3.99    |

Output size of the Parcel 2 build:

du -h bench/three/parcel2/entry.parcel2.js*
5.8M bench/three/parcel2/entry.parcel2.js
 18M bench/three/parcel2/entry.parcel2.js.map
> child worker processes

These aren't processes but threads, so --max-old-space-size should apply automatically.
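One quick way to check whether a V8 flag passed to the main process actually applies inside the worker threads is to compare the heap limit in both places (a standalone sketch, not part of Parcel; the file name is made up, and you'd run it with node --max-old-space-size=8192 check-heap-limit.js):

const { Worker, isMainThread } = require('worker_threads');
const v8 = require('v8');

// heap_size_limit roughly reflects the effective --max-old-space-size setting.
const limitMb = Math.round(v8.getHeapStatistics().heap_size_limit / 1024 / 1024);

if (isMainThread) {
  console.log(`main thread heap limit: ${limitMb} MB`);
  new Worker(__filename); // run this same file again as a worker thread
} else {
  console.log(`worker thread heap limit: ${limitMb} MB`);
}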
> Builds successfully for me; peak memory was about 2.5 GB.

That's strange. It always crashes with ERR_WORKER_OUT_OF_MEMORY for me.
> These aren't processes but threads, so --max-old-space-size should apply automatically.

Here's what happens when I try --max-old-space-size:
$ node --max-old-space-size=8192 ./node_modules/.bin/parcel build --no-autoinstall ...
Error [ERR_WORKER_INVALID_EXEC_ARGV]: Initiated Worker with invalid execArgv flags: --max-old-space-size=8192
at new Worker (internal/worker.js:134:13)
at ThreadsWorker.start (node_modules/@parcel/workers/lib/threads/ThreadsWorker.js:39:19)
at Worker.fork (node_modules/@parcel/workers/lib/Worker.js:80:23)
at WorkerFarm.startChild (node_modules/@parcel/workers/lib/WorkerFarm.js:183:12)
at WorkerFarm.startMaxWorkers (node_modules/@parcel/workers/lib/WorkerFarm.js:341:14)
at new WorkerFarm (node_modules/@parcel/workers/lib/WorkerFarm.js:120:10)
at createWorkerFarm (node_modules/@parcel/core/lib/Parcel.js:516:10)
at Parcel.init (node_modules/@parcel/core/lib/Parcel.js:186:196)
at async Parcel.run (node_modules/@parcel/core/lib/Parcel.js:247:7)
at async Command.run (node_modules/parcel/lib/cli.js:211:7) {
code: 'ERR_WORKER_INVALID_EXEC_ARGV'
}
I was able to get it to work if I change this line in packages/core/workers/src/Worker.js:
- v => !/^--(debug|inspect)/.test(v),
+ v => !/^--(debug|inspect|max-old-space-size=)/.test(v),
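In other words, the change just adds --max-old-space-size to the flags that get stripped out of process.execArgv before the rest is forwarded to a worker thread, since (as the ERR_WORKER_INVALID_EXEC_ARGV error above shows) Node refuses that flag in a Worker's execArgv. A rough sketch of the pattern outside of Parcel (the worker file name is hypothetical):

const { Worker } = require('worker_threads');

// Drop flags that Node rejects in a Worker's execArgv, and forward the rest.
const workerExecArgv = process.execArgv.filter(
  v => !/^--(debug|inspect|max-old-space-size=)/.test(v)
);

// 'worker-entry.js' is a hypothetical worker script, used only for illustration.
new Worker(require.resolve('./worker-entry.js'), { execArgv: workerExecArgv });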
@mischnic Would you accept a PR to fix this? It's just a one-line change, so I'm happy to make a PR if it'd be helpful.
Sure.
Closing this because the PR with the workaround has landed.