Node: Performance regression when enumerating objects after many deletes/inserts

Created on 26 Feb 2020 · 6 comments · Source: nodejs/node

  • Version: 12.16.1
  • Platform: Linux; OSX
  • Subsystem: V8

Once we upgraded a service to Node 12, we noticed a huge spike in CPU utilization. A client library was doing many inserts/deletes on an object over time.

Big thanks to @mmarchini, who root-caused the issue and wrote the reproduction code.

What steps will reproduce the bug?

Inserting and deleting many properties on an Object ({}) will cause the issue.
The following script reproduces it:

'use strict';

const foo = {};

let ms = 0;
const C = 8380000;
// Insert C properties one at a time, timing each insert; delete all but every 175th key.
for (let i=0; i < C; i++) {
    const key = `foo-${i}`;
    const start = new Date();
    foo[key] = {};
    ms += new Date() - start;
    if (i % 175)
        delete foo[key];
}
console.log(`Inserted ${C} in ${ms}ms, average ${ms/C} per insert`);

const STEP = 5000;
// After the churn above, run three more batches of STEP timed insert+delete pairs.
for (let j=0; j < 3; j++) {
    ms = 0;
    for (let i=0; i < STEP; i++) {
        const key = `${j}-${i}`;
        const start = new Date();
        foo[key] = {};
        ms += new Date() - start;
        delete foo[key];
    }
    console.log(`Inserted ${STEP} in ${ms}ms, average ${ms/STEP}ms per insert`);
}

On Node 12.x this results in output like:

Inserted 8380000 in 8684ms, average 0.0010362768496420048 per insert
Inserted 5000 in 4ms, average 0.0008ms per insert
Inserted 5000 in 7483ms, average 1.4966ms per insert
Inserted 5000 in 25055ms, average 5.011ms per insert

While on Node 10.x it results in output like:

Inserted 8380000 in 7436ms, average 0.0008873508353221957 per insert
Inserted 5000 in 1ms, average 0.0002ms per insert
Inserted 5000 in 14ms, average 0.0028ms per insert
Inserted 5000 in 6ms, average 0.0012ms per insert

(results from Darwin nfml-ghermetoX5J 18.7.0 Darwin Kernel Version 18.7.0: Thu Jun 20 18:42:21 PDT 2019; root:xnu-4903.270.47~4/RELEASE_X86_64 x86_64)

Additional information

After a lengthy investigation we were able to verify that V8 version 7.8.279.23, which ships with Node.js 12.16.x, has a regression that was introduced in V8 7.x. The issue is fixed in V8 8.x and was caused by a bitfield overflow after many deletes/inserts on an object.

The issue doesn't manifest when using a Map instead of a plain Object.
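
For illustration, here is a minimal sketch (not part of the original report) of the same churn pattern using a Map; on the affected Node 12.x versions this variant does not exhibit the slowdown:

'use strict';

// Same insert/delete pattern as the repro script above, but on a Map.
const foo = new Map();

let ms = 0;
const C = 8380000;
for (let i = 0; i < C; i++) {
    const key = `foo-${i}`;
    const start = new Date();
    foo.set(key, {});
    ms += new Date() - start;
    if (i % 175)
        foo.delete(key);
}
console.log(`Inserted ${C} in ${ms}ms, average ${ms/C} per insert`);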

Labels: V8 Engine, performance, v12.x

All 6 comments

Sounds like https://github.com/nodejs/node/pull/31957 is the PR that fixes this.

Same issue here; bisecting points to the same V8 commit for the fix (https://github.com/v8/v8/commit/f7771e5b0cc46ab75fe5291482a727b3f115dcba). The original performance regression bisects to https://github.com/v8/v8/commit/6621c2d8ec0738fc83bd43e42a3aa97b47e1a64a, first seen in V8 7.1.310, which is consistent with all Node 12.x and 13.x versions having the bug, since Node 12.0.0 jumped to V8 7.4.288 (Node 11.15.0 was on V8 7.0.276.38).

I agree with mmarchini (https://github.com/nodejs/node/pull/31957#issue-379897507) that backporting the fix to Node 12.x's V8 would be great. I initially worked around the bug by replacing the very large POJO dictionaries with Maps, but Maps don't perform as well as Node 10 POJOs (or Node 12 POJOs for the first ~8 million insert/deletes) for our workload.
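
As a rough sketch of that kind of workaround (hypothetical names, not the actual client library code), the dictionary-style object usage translates to Map methods like this:

'use strict';

// Hypothetical example: a plain-object dictionary vs. its Map equivalent.
const key = 'session-123';
const value = { hits: 1 };

// Before: plain object used as a dictionary (hits the Node 12.x regression under heavy churn).
const cacheObj = {};
cacheObj[key] = value;                     // insert
for (const k in cacheObj) console.log(k);  // enumerate
delete cacheObj[key];                      // delete

// After: Map-based equivalent, which does not exhibit the regression.
const cacheMap = new Map();
cacheMap.set(key, value);                          // insert
for (const k of cacheMap.keys()) console.log(k);   // enumerate
cacheMap.delete(key);                              // delete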

The next release of Node.js 12.x is scheduled for March 24th, with an RC planned for the 17th. That should allow the patch to land in a 13.x release next week. The potential fix does not land cleanly on 12.x, though, and I've requested a backport.

Oh, I've been tracking this down for the past few days. It seems to have regressed as part of the V8 update for 12.16.0 (bisect pointed me to b4e8f0d, which is the first commit that compiles for that release).

I'll try out the patch 👍

Edit: I've tested https://github.com/nodejs/node/commit/3d894d0511bc76c2bd886077d1cb434d878527de against the commit before it, and it does fix the OP example, but it does not fix the example that I have, so there must be something more to it. I'll try to create a reduced test case...


Edit 2: the below code seems to be consistently slower on [email protected] than it used to be on [email protected], but it's only 1-2%, so I'm not entirely sure it accounts for all the slowness I'm seeing:

'use strict';

const NUM_METHODS = 250;
const ITERATIONS = 10000;

const obj = {};

// Build an object with NUM_METHODS properties.
for (let i = 0; i < NUM_METHODS; ++i) {
    obj[`item${i}`] = {};
}

console.time();

// Time ITERATIONS shallow copies of obj via Object.assign.
for (let i = 0; i < ITERATIONS; ++i) {
    const res = Object.assign({}, obj);
}

console.timeEnd();

The above code produces roughly the same results before and after https://github.com/nodejs/node/commit/3d894d0511bc76c2bd886077d1cb434d878527de, so this points to another regression anyway?

@dominykas it sounds like a different regression. Do you mind opening a separate issue?

It looks like the fix landed in v12.16.2.
