Running some tests with benchmark.js, I found it might be a good idea to redo MessageEmbed and probably a few other structures. As it stands, we set the properties of its instances in the constructor even when they are undefined. This takes up more memory than necessary, and, as it turns out, is actually slower than checking whether the property exists first.
CORRECTION: I can't math. The check is ~16% slower in the case where the property is defined, as should be expected. IMO that could still be worth the memory trade-off, but that's up to the maintainers.
If I haven't made a mistake in the tests, would it be worth looking through the structures and removing the unneeded property assignments?
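For context, here's a minimal sketch of the two patterns being compared (hypothetical class and property names, not the actual MessageEmbed fields):
```
// Hypothetical, simplified structures for illustration only.
class EagerEmbed {
  constructor(data = {}) {
    // Current pattern: always assign, even when the value is undefined.
    this.type = data.type;
    this.title = data.title;
  }
}

class LazyEmbed {
  constructor(data = {}) {
    // Proposed pattern: only assign properties that actually exist.
    if (data.type) this.type = data.type;
    if (data.title) this.title = data.title;
  }
}
```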
The tests I ran:
```
const Benchmark = require('benchmark');

// The test functions below assign to shared bindings; declared here so the
// snippet also runs in strict mode.
let source, target, fetched;

new Benchmark.Suite()
  .add('Baseline - Nothing Executed', () => {})
  .add('Certain | Defined', {
    fn: () => {
      source = { type: 'rich' };
      target = {};
      target.type = source.type;
      fetched = target.type;
    },
    minSamples: 100
  })
  .add('Certain | Undefined', {
    fn: () => {
      source = {};
      target = {};
      target.type = source.type;
      fetched = target.type;
    },
    minSamples: 100
  })
  .add('Uncertain | Defined', {
    fn: () => {
      source = { type: 'rich' };
      target = {};
      if (source.type) target.type = source.type;
      fetched = target.type;
    },
    minSamples: 100
  })
  .add('Uncertain | Undefined', {
    fn: () => {
      source = {};
      target = {};
      if (source.type) target.type = source.type;
      fetched = target.type;
    },
    minSamples: 100
  })
  .on('complete', results => console.log(results.currentTarget.map(target => `${target}`).join('\n')))
  .run({ async: true });
```
Output:
```
Baseline - Nothing Executed x 509,272,837 ops/sec ±1.25% (57 runs sampled)
Certain | Defined x 17,298,131 ops/sec ±0.79% (154 runs sampled)
Certain | Undefined x 15,584,237 ops/sec ±0.66% (153 runs sampled)
Uncertain | Defined x 14,881,882 ops/sec ±1.02% (153 runs sampled)
Uncertain | Undefined x 15,003,022 ops/sec ±0.71% (155 runs sampled)
```
I'm not seeing any notable improvement in those results.
17,298,131 / 14,881,882 * 100 - 100 = 16.2361789994%
Yup, I was just plain wrong; it's a fair bit slower. The memory savings still seem potentially worth it in my eyes, though. It could be a client option.
The V8 JavaScript engine optimizes objects which share the same shape (objects that are determined to have the same properties, added in the same order). Adding properties after initialization leads to deoptimization and, eventually, slow dictionary-mode objects which the engine won't even try to optimize anymore.
The addition of a few undefined or null properties doesn't increase the size of instances significantly, but the lack of them causes a number of performance regressions.
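A minimal sketch of what that means in practice (illustrative objects only; V8's internals are simplified here):
```
// These two objects receive the same properties in the same order, so V8
// assigns them the same hidden class (shape) and can optimize access to them.
const a = { type: 'rich', title: 'a' };
const b = { type: 'image', title: 'b' };

// This object starts with a different shape and transitions to yet another
// one when `title` is added after initialization.
const c = { type: 'video' };
c.title = 'c';

// Deleting a property is worse still: V8 may move the object into slow
// dictionary mode, which it won't try to optimize the same way.
delete b.type;
```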
You can see an example of how shape can impact performance here: https://jsperf.com/shape-loops/1

You can read about how JavaScript engines optimize objects of the same shape here: https://mathiasbynens.be/notes/shapes-ics