Node: buffer.toString() no longer accepts null as encoding

Created on 26 Jun 2017 · 10 comments · Source: nodejs/node

  • Version: v8.1.2
  • Platform: Darwin ibc-macbook 16.6.0 Darwin Kernel Version 16.6.0: Fri Apr 14 16:21:16 PDT 2017; root:xnu-3789.60.24~6/RELEASE_X86_64 x86_64
  • Subsystem:

According to the doc, buffer.toString([encoding[, start[, end]]]):

encoding The character encoding to decode to. Default: 'utf8'

In Node 4/5/6/7 this works:

buffer.toString(null, 0);
buffer.toString(undefined, 0);

However, in Node 8, null does not work and produces TypeError: Unknown encoding: null.

Example:

var buf = Buffer.from("hello", 'utf8');

buf.toString(null);
// => TypeError: Unknown encoding: null

Not sure why null is no longer considered "use the default value".

buffer

All 10 comments

It is most likely caused by https://github.com/nodejs/node/pull/11120, which is listed among the semver-major changes.

To be fair, though, fixing both this use case and the original one detailed in the PR that made this change (passing 0 as the first argument to toString()) would be as simple as changing === undefined to == undefined.

Whether we'd want to support such a change is another question (e.g. would anyone complain about not accepting false for example?).
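To illustrate what that one-character change would do, here is a minimal sketch of hypothetical defaulting logic (not the actual lib/buffer.js source). The key point is that null == undefined is true, while null === undefined is false:

```javascript
// Hypothetical helpers for illustration only, not Node's real internals.

function resolveEncodingStrict(encoding) {
  // Strict equality: only undefined falls through to the default.
  if (encoding === undefined) return 'utf8';
  return encoding;
}

function resolveEncodingLoose(encoding) {
  // Loose equality: null == undefined, so both fall through.
  if (encoding == undefined) return 'utf8';
  return encoding;
}

console.log(resolveEncodingStrict(null));  // null (would later fail as "unknown encoding")
console.log(resolveEncodingLoose(null));   // 'utf8'
```

As the comment above notes, the loose check would also let other falsy values through only if they are null or undefined; false or 0 would still be rejected.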

I understand the rationale given in #11120. However, shouldn't null be also valid (and not just undefined) for encoding?

null was never documented as a "valid" encoding, so I don't see any reason to support it. It's just more ground for confusion.

Another +1 from me for keeping the 8.x behavior. I did not review the original PR because I saw it already had lots of love - but I'm definitely in favor of the behavior change.

@vsemozhetbyt One small request … can you leave off the version-specific labels if the issue is the same on master? Otherwise it seems like the issue is only occurring on v8.x. :)

Closing as expected behavior.

Why is this the expected behavior? The signature of the method is:

buffer.toString([encoding[, start[, end]]])

So encoding is an optional argument. However, where is it written that passing encoding = undefined means "use the default value"? Is there any rule out there stating that undefined is the only way to mean that while null is invalid?

That's how default arguments work in JavaScript in general:

> function foo(arg = 5) { console.log(arg); }
> foo()
5
> foo(undefined)
5
> foo(null)
null

However, it might be a good idea to document that explicitly.

So encoding is an optional argument.

Per the signature, it's only optional if start and end aren't specified, otherwise it would be buffer.toString([encoding][, start[, end]]) or something like that.

Good and clarifying code snip. Thanks.

Per the signature, it's only optional if start and end aren't specified, otherwise it would be buffer.toString([encoding][, start[, end]]) or something like that.

Completely right. Thanks, I was wrong.

