Please mention your node.js, mongoose and MongoDB version.
Node v9.2.0, mongoose 5.0.6, MongoDB v3.0.6
Hello. I have 200,000 items in my MongoDB collection. When I query the data with `Item.find().where('couponEffectiveEndTime').gte(currentTimestamp)`,
the app waits about 5 seconds and then crashes with the error below:
```
<--- Last few GCs --->

[27140:000002855B850110] 59570 ms: Mark-sweep 1410.5 (1471.7) -> 1410.5 (1440.7) MB, 1233.9 / 0.0 ms last resort GC in old space requested
[27140:000002855B850110] 60624 ms: Mark-sweep 1410.5 (1440.7) -> 1410.5 (1440.7) MB, 1053.2 / 0.1 ms last resort GC in old space requested

<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0000034521EA5749 <JSObject>
    1: completeMany [D:\workspace\GuangZi\Alimama\node_modules\mongoose\lib\query.js:~1433] [pc=0000003E602C1E2B](this=0000032B2708C0F9 <JSGlobal Object>,model=000001333F4923D9 <JSFunction model (sfi = 0000014659040641)>,docs=000000F4A4F63BF1 <JSArray[189462]>,fields=000000F4A4F64339 <Object map = 0000026F7F8823B9>,userProvidedFields=000000F4A4F64371 <Object map = 0000026F7F8823B9>,pop=000003D...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
```
Why does this happen?
Hi @Bangood,
A Node.js process only allocates RAM up to a set limit; once you hit that limit, Node exits. Every document your query loads increases the amount of memory consumed. Your stack trace shows an array of 189,462 documents being hydrated at once, so there's a good chance you're hitting that limit.
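For reference, a quick way to inspect the heap ceiling from inside a process is `v8.getHeapStatistics()`. This snippet is just an illustration, not part of the original thread:

```js
// show-heap.js: print V8's configured heap limit and current usage.
// Running with `node --max-old-space-size=4096 show-heap.js` raises the
// ceiling, though that only delays the crash for an unbounded query.
const v8 = require('v8');

const { heap_size_limit, used_heap_size } = v8.getHeapStatistics();
console.log(`heap limit: ${(heap_size_limit / 1024 / 1024).toFixed(1)} MB`);
console.log(`heap used:  ${(used_heap_size / 1024 / 1024).toFixed(1)} MB`);
```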
Some potential strategies to avoid this situation would be:

- stream the results with a query cursor (`Query.prototype.cursor()`) so you process one document at a time instead of holding the whole result set in memory (see the sketch after this list)
- paginate/batch the query with `limit()` and `skip()` so each round trip returns a bounded number of documents
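A minimal cursor sketch, assuming the `Item` model and `currentTimestamp` from your question; `handleItem` is a hypothetical per-document callback:

```js
// Stream matching documents one at a time instead of buffering
// the entire result set (189k+ documents) into a single array.
const cursor = Item.find()
  .where('couponEffectiveEndTime').gte(currentTimestamp)
  .cursor();

cursor
  .eachAsync(async (item) => {
    await handleItem(item); // hypothetical per-document work
  })
  .then(() => console.log('done processing all matching items'))
  .catch((err) => console.error(err));
```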
Let me know if you have any questions. If you want to dig into the specifics of your problem further, please provide your schema, relevant queries, sample data, and anything else necessary to replicate the issue.
@lineus explained it well, but yes, I would recommend using a cursor or doing a paginated/batched query.
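For completeness, here is what a paginated/batched version of the same query might look like. The batch size, loop structure, and `handleItem` helper are illustrative, not from the thread:

```js
// Fetch the result set in fixed-size batches so memory use stays bounded.
const BATCH_SIZE = 1000; // illustrative

async function processInBatches() {
  for (let page = 0; ; page++) {
    const batch = await Item.find()
      .where('couponEffectiveEndTime').gte(currentTimestamp)
      .skip(page * BATCH_SIZE)
      .limit(BATCH_SIZE)
      .lean(); // plain JS objects cost less memory than hydrated documents

    if (batch.length === 0) break;

    for (const item of batch) {
      await handleItem(item); // hypothetical per-document work
    }
  }
}
```

Note that `skip()` scans past the skipped documents server-side, so it gets slower as the offset grows; for very large collections, a range query on an indexed field (or the cursor sketch above) scales better.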
Versions 5.0.16 and 5.0.17 work fine, but version 5.0.18 triggers this issue in my project.
@javiercbk can you open a new issue with a reproducible example that demonstrates the behavior?
@lineus I'll try to generate a test case. In my case, a unit test is triggering this error, so it shouldn't be hard for me to put something like that together.