Node: Problem with buffers after upgrade from v7.5.0 to v10.7.0

Created on 24 Jul 2018 · 9 comments · Source: nodejs/node

v10.7.0

// If the length does not fit in 2 bytes, set the 7-bit length code to 127
// and write the full length into the following 8 bytes.
else if (data_length <= self.max_8_byte) { // self.max_8_byte = 0xFFFFFFFFFFFFFFFF
  frame[1] += 127;
  var len_buff = new Buffer(8);
  len_buff.writeUIntBE(data_length, 0, 8); // throws on Node.js 10+
  frame = Buffer.concat([frame, len_buff]);
}

Throws error:

RangeError [ERR_OUT_OF_RANGE]: The value of "byteLength" is out of range. It must be >= 1 and <= 6. Received 8
    at boundsError (internal/buffer.js:55:9)
    at Buffer.writeUIntBE (internal/buffer.js:588:3)

The same code works fine in v7.5.0. Were there some big, breaking changes without backwards compatibility?
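
Since v10.0.0, writeUIntBE() rejects any byteLength above 6, so one workaround for the 8-byte extended-length field in the snippet above is to split the value into two 32-bit halves. A minimal sketch, assuming data_length never exceeds Number.MAX_SAFE_INTEGER (encodeExtendedLength is a hypothetical helper name, not part of the original code):

function encodeExtendedLength(data_length) {
  // Build the 8-byte big-endian length field from two 32-bit writes.
  var len_buff = Buffer.alloc(8);
  len_buff.writeUInt32BE(Math.floor(data_length / 0x100000000), 0); // high 32 bits
  len_buff.writeUInt32BE(data_length % 0x100000000, 4);             // low 32 bits
  return len_buff;
}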

buffer


All 9 comments

Aha! It seems there is no core support for unsigned 64-bit big-endian integers.
For those who hit the same issue, here is a working package:
https://www.npmjs.com/package/int64-buffer
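
Usage looks roughly like this (based on that package's README at the time; treat the exact API as an assumption and check the current docs):

var Uint64BE = require('int64-buffer').Uint64BE;

var buf = Buffer.alloc(8, 0xff);       // example 8-byte big-endian value
var big = new Uint64BE(buf);           // wrap the buffer
big.toString(16);                      // 'ffffffffffffffff', precision-safe
big.toNumber();                        // plain number, may lose precision above 2**53
new Uint64BE(1234567890).toBuffer();   // write direction: number -> 8-byte Buffer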

The errors were introduced in node v10.0.0. For proper 64-bit value support, see https://github.com/nodejs/node/pull/19691.
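
For reference, the BigInt-based accessors eventually shipped in Node.js 12, so on current versions an 8-byte value can be written and read directly:

const buf = Buffer.alloc(8);
buf.writeBigUInt64BE(0xffffffffffffffffn, 0); // takes a BigInt, not a Number
buf.readBigUInt64BE(0);                       // 18446744073709551615n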

That code indeed worked fine before v10.0.0, and apparently supported all safe uint values:

$ ~/tmp/nodejs/node-v6.14.1-linux-x64/bin/node 
> Number.MAX_SAFE_INTEGER
9007199254740991
> x = Buffer.alloc(8, 0x42); x
<Buffer 42 42 42 42 42 42 42 42>
> x.writeUIntBE(Number.MAX_SAFE_INTEGER, 0, 8); x
<Buffer 00 1f ff ff ff ff ff ff>
> x.readUIntBE(0, 8);
9007199254740991

The docs stated «Must satisfy: 0 < byteLength <= 6» though.

The former behavior was somewhat undefined and risky since almost all values above Number.MAX_SAFE_INTEGER would have been wrong. Soon, Node.js will hopefully support BigInt, but otherwise I cannot see what could be improved here.

We could theoretically accept a higher byteLength in case the passed-in value is <= Number.MAX_SAFE_INTEGER. What do others think about that?

The former behavior was somewhat undefined and risky since almost all values above Number.MAX_SAFE_INTEGER would have been wrong.

Number.MAX_SAFE_INTEGER is 2**53 - 1, but the cap introduced in code in v10.0.0 is 2**48 (six bytes; it was present in the documentation before that), which is 32 times lower.

Numbers between 2**48 and 2**53 were not risky, but (undocumented) support for writing them has been removed.

Note: I am not proposing to change that (yet); I have not thought about it much. I am just outlining things.
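
To make the gap concrete: on Node.js 10+, 2**48 - 1 is the largest value writeUIntBE() accepts with its maximum byteLength of 6, even though exact integers go up to 2**53 - 1:

const b = Buffer.alloc(6);
b.writeUIntBE(2 ** 48 - 1, 0, 6); // ok: <Buffer ff ff ff ff ff ff>
b.writeUIntBE(2 ** 48, 0, 6);     // RangeError [ERR_OUT_OF_RANGE]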

Given that this is documented, it seems reasonable to me to close this out. Feel free to reopen if you disagree, although I do think that should be accompanied by a PR.

Has there been any solution for reading binary 64-bit integers, something like readIntBE(0, 8)?

@leimao You could do:

BigInt(`0x${buf.toString('hex', 0, 8)}`);

That's one of the fastest methods I found a while back. Just be aware of possible endianness issues.
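
For example, on a Node.js version that has BigInt (10.4.0 and later):

const buf = Buffer.from('001fffffffffffff', 'hex');
BigInt(`0x${buf.toString('hex', 0, 8)}`);          // 9007199254740991n, big-endian read
// For little-endian data, reverse a copy of the bytes first:
BigInt(`0x${Buffer.from(buf).reverse().toString('hex')}`);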

Hello mscdex,

Thanks for the quick response. The binary big ints I am reading are from HBase. I also assume that the integers I get from HBase will not exceed the integer precision that JavaScript numbers can represent. At first I thought I could do

buff.readIntBE(2, 6)

But later I realized that this might only work for non-negative integers. I then found a package, node-int64, which might be very useful for achieving my goal.

var Int64 = require('node-int64');
var int64 = new Int64(buff);     // wrap the 8-byte buffer from HBase
var num = int64.toNumber(true);  // true allows an imprecise (lossy) conversion

If you have any further suggestions or recommendations, please let me know.

PS: I think your solution is more compatible with future versions of Node.js, since it does not rely on external libraries, and the value's precision is not limited the way it can be when converting through the node-int64 library.

Best,

Lei
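
For later readers: Node.js 12 and later also have a signed BigInt reader, which covers this case without an external package:

const buff = Buffer.from('fffffffffffffffe', 'hex'); // -2 as a signed 64-bit BE value
const big = buff.readBigInt64BE(0);                  // -2n, a BigInt
const num = Number(big);                             // only safe if the value fits in 53 bits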
