V2: yarn build crashes after 92%
yes
Execute yarn build on a nontrivial number of docs (e.g. 400 md files)
Expected: yarn build does not crash and produces docs
This is the output of the failing command:
Compiling
● Client █████████████████████████ after chunk asset optimization (93%)
● Server █████████████████████████ building (70%) 799/799 modules 0 active
<--- Last few GCs --->
[72094:0x1026a0000] 186455 ms: Mark-sweep 2030.6 (2050.9) -> 2030.4 (2051.4) MB, 1099.8 / 0.0 ms (average mu = 0.150, current mu = 0.157) allocation failure GC in old space requested
[72094:0x1026a0000] 188396 ms: Mark-sweep 2031.1 (2051.4) -> 2030.9 (2051.9) MB, 1725.5 / 0.0 ms (average mu = 0.128, current mu = 0.111) allocation failure GC in old space requested
<--- JS stacktrace --->
==== JS stack trace =========================================
0: ExitFrame [pc: 0x100ea9f02]
Security context: 0x17c4d139a2f1 <JSObject>
1: /* anonymous */(aka /* anonymous */) [0x17c4618e27a9] [/<some location>/Projects/magic-script-typings/website/node_modules/webpack/lib/Stats.js:~568] [pc=0x2ec40ce06684](this=0x17c4754804d1 <undefined>,0x17c4086cc5c1 <ModuleReason map = 0x17c461a50d69>)
2: map [0x17c4d138c321](this=0x17c4b88c7f01 <JSArray[156333]>,0x17c4618e27a9 <JSFunction (sfi = 0...
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
Writing Node.js report to file: report.20190913.100136.72094.0.001.json
Node.js report completed
1: 0x100075bd5 node::Abort() [/<some location>/.nvm/versions/node/v12.4.0/bin/node]
2: 0x100076316 node::errors::TryCatchScope::~TryCatchScope() [/<some location>/.nvm/versions/node/v12.4.0/bin/node]
3: 0x1001697d7 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/<some location>/.nvm/versions/node/v12.4.0/bin/node]
4: 0x10016976c v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/<some location>/.nvm/versions/node/v12.4.0/bin/node]
5: 0x1005480d5 v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/<some location>/.nvm/versions/node/v12.4.0/bin/node]
6: 0x1005491c3 v8::internal::Heap::CheckIneffectiveMarkCompact(unsigned long, double) [/<some location>/.nvm/versions/node/v12.4.0/bin/node]
7: 0x100546bc3 v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/<some location>/.nvm/versions/node/v12.4.0/bin/node]
8: 0x10054487f v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/<some location>/.nvm/versions/node/v12.4.0/bin/node]
9: 0x10054f365 v8::internal::Heap::AllocateRawWithLightRetry(int, v8::internal::AllocationType, v8::internal::AllocationAlignment) [/<some location>/.nvm/versions/node/v12.4.0/bin/node]
10: 0x10054f3df v8::internal::Heap::AllocateRawWithRetryOrFail(int, v8::internal::AllocationType, v8::internal::AllocationAlignment) [/<some location>/.nvm/versions/node/v12.4.0/bin/node]
11: 0x10051e10f v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationSpace) [/<some location>/.nvm/versions/node/v12.4.0/bin/node]
12: 0x1007cf111 v8::internal::Runtime_AllocateInTargetSpace(int, unsigned long*, v8::internal::Isolate*) [/<some location>/.nvm/versions/node/v12.4.0/bin/node]
13: 0x100ea9f02 Builtins_CEntry_Return1_DontSaveFPRegs_ArgvOnStack_NoBuiltinExit [/<some location>/.nvm/versions/node/v12.4.0/bin/node]
error Command failed with signal "SIGABRT".
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
Any reproducible repo? This is hard to reproduce otherwise.
Repo to reproduce: https://github.com/leoz/docusaurus_crash
@endiliey
I have a feeling it might be related to the famous "92% chunk asset optimization" issue described e.g. here: https://github.com/angular/angular-cli/issues/5775

Hey @endiliey, anything else I could provide to help the investigation? This issue is pretty nasty; I cannot use Docusaurus V2 at all because of it.
Can you try increasing your memory limit?
https://github.com/angular/angular-cli/issues/5618#issuecomment-450151214
"92% chunk asset optimization" is most likely related to webpack - terser plugin. (during uglification/minification)
From the log, its out of memory problem
Maybe related https://github.com/webpack-contrib/terser-webpack-plugin/issues/143
A quick hack/workaround for you: create a local Docusaurus plugin and override the webpack config to set minification to false.
edit: does not work
@endiliey, did it actually work for you?
If I patch a local copy of Docusaurus and set "minimize" to false and "minimizer" to undefined, the crash is still there.
Yep, doesn't work.

This issue is indeed quite common for webpack projects with a large number of files. A quick search on Google shows it has happened on Nuxt, Angular, Electron, etc., and there's no perfect solution I'm seeing other than increasing the allocated memory.
https://github.com/webpack/webpack/issues/8431
https://github.com/webpack/webpack/issues/4550
I tried increasing the memory limit to 16 GB and it works:
"scripts": {
  "build": "node --max_old_space_size=16000 node_modules/@docusaurus/core/bin/docusaurus build"
},

Well, unfortunately, I cannot confirm it worked. I tried increasing the memory limit for node and it still crashed, just later, on the "Creating an optimized..." step. I used more docs though, about 2x the number I've posted on GitHub.
@leoz can you provide instructions for https://github.com/leoz/docusaurus_crash? I want to investigate the situation.
Hey @evilebottnawi , here you are:
git clone https://github.com/leoz/docusaurus_crash.git
cd docusaurus_crash/website/
yarn
yarn build
Thanks, investigating.
@leoz A new version of terser-webpack-plugin changed the logic for source map generation. Previously we generated broken source maps; now they are correct, but unfortunately that takes more memory. We can do nothing on our side: terser consumes a lot of memory when working with source maps, and terser itself would need improvements to reduce memory usage. The only solution is node --max_old_space_size=16000 (maybe you can pick a lower value), or you can disable source map generation. There is no golden solution. In webpack@5 we will use the new version of source-map (WASM), so memory consumption should be lower.
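For the "disable source map generation" route, a webpack override might look like the following sketch. Note the sourceMap option applies to the terser-webpack-plugin majors of that era (v1/v2); newer majors infer it from devtool instead, so treat this as an illustration rather than a drop-in config:

```javascript
// Hypothetical webpack config fragment: give up source maps to save
// memory during minification.
const TerserPlugin = require('terser-webpack-plugin');

module.exports = {
  devtool: false, // no source maps at all
  optimization: {
    minimize: true,
    minimizer: [new TerserPlugin({ sourceMap: false })],
  },
};
```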
I can pass the "Client" compilation (with some hacks),
but it gets stuck on the "Server" stage:
Server █████████████████████████ building (70%)
@leoz
Try upgrading to alpha.25 and then bumping the memory limit:
"scripts": {
  "build": "node --max_old_space_size=16000 node_modules/@docusaurus/core/bin/docusaurus build"
},
I think that's the only solution as of now. I can't reproduce the crash anymore after increasing the memory limit.

You can also try tweaking node_modules/@docusaurus/core/lib/webpack/base.js and adding these lines:
output: {
  futureEmitAssets: true,
},
It's an option that tells webpack to use the future version of the asset-emitting logic, which allows freeing the memory of Sources, with the trade-off of disallowing reading asset content after emitting. This is webpack@5's default way of emitting assets.
The futureEmitAssets: true option did not make any difference.
However, this command worked:
"build": "node --max_old_space_size=32000 node_modules/@docusaurus/core/bin/docusaurus build"
This is, of course, not a solution but rather a workaround. But it solves my problem for now :)
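As an aside, the same flag can also be supplied through the NODE_OPTIONS environment variable (honored by Node itself), so plain yarn build keeps working without editing package.json; a sketch:

```shell
# Raise the V8 old-space limit for every node process in this shell:
export NODE_OPTIONS=--max-old-space-size=32000
# Print the resulting heap size limit in MB to confirm it took effect:
node -p "Math.round(require('v8').getHeapStatistics().heap_size_limit / 1048576)"
```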
Thanks!
@leoz
Do you mind trying https://github.com/themgoncalves/react-loadable-ssr-addon/pull/18
and patching your Docusaurus node_modules/react-loadable-ssr-addon with it?
Upon another fresh install & investigation: this still crashes on alpha.27, although the memory usage is reduced significantly.

This is somehow related to clean-webpack-plugin's use of the webpack stats.toJson() object. I have submitted a PR for them: https://github.com/johnagan/clean-webpack-plugin/pull/164
Manually patching node_modules makes it not crash.

Re-opening for now. Possible solution for us: remove clean-webpack-plugin and maybe implement our own. Using a plugin just to remove the build folder can be overkill for a simple use case.
This issue is completely gone with Docusaurus v2.0.0-alpha.29.
Great job, guys!
Thank you!
@leoz sure thing. Actually had to send two PRs to fix this issue (because it's a dependency problem).
We could reduce the RAM usage even further, but that would require us to fork another webpack plugin we depend on.
By the way, the number of docs you have is very large. One of the docs is as big as 50 KB. x.x