Powershell: convertto-json bigint is not properly serialized

Created on 22 Mar 2019 · 7 comments · Source: PowerShell/PowerShell

[bigint]1|ConvertTo-Json

results in

{
"IsPowerOfTwo": true,
"IsZero": false,
"IsOne": true,
"IsEven": false,
"Sign": 1
}

instead of 1
Tested on 6.2.0 on Linux and 5.1 on Windows.

Area-Cmdlets-Utility Issue-Discussion


All 7 comments

JavaScript implementations cannot read integers that need more than 53 bits of precision without loss, so there's no clearly correct answer here, and the main choices are:

  • Pick what another system does and do that; Newtonsoft Json.Net encodes larger integers than JavaScript can read safely and risks compatibility problems, Twitter encodes large numbers as strings and cautions developers to expect that, Google V8 refuses to serialize BigInts saying it isn't supported.
  • Encode small values, but error for BigIntegers representing larger values. (Might be unpredictable)
  • Keep the current behavior: the property-bag encoding isn't useful, but it is consistent - it doesn't promise anything it can't deliver.
  • Defer until such time as ECMAScript / Google have agreed ways of serialising BigInts to JSON, and do what they do.
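The first option (encode large integers as strings, Twitter-style) can be sketched with a JSON.stringify replacer. This is an illustrative sketch, not anything PowerShell does today; the replacer name is made up:

```javascript
// Sketch of the Twitter-style option: serialize BigInt values as strings
// so a JavaScript consumer can read them without precision loss.
// The replacer function name is illustrative.
const bigIntToString = (key, value) =>
  typeof value === "bigint" ? value.toString() : value;

const payload = { id: 9223372036854775808n, name: "example" };
const json = JSON.stringify(payload, bigIntToString);
console.log(json); // {"id":"9223372036854775808","name":"example"}
```

The replacer runs before serialization, so the BigInt never reaches the step that would otherwise throw; the trade-off is that consumers must know to parse the string back into a numeric type.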

Background links:

The ECMA-404 JSON standard doesn't put a limit on how large a value can be serialized as an integer in JSON, but JavaScript is limited to 53 bits of precision, so it can't read integers larger than that exactly and silently rounds them. Twitter works around this by encoding its 64-bit numeric identifiers as strings and cautions developers to expect that.
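The 53-bit limit is easy to demonstrate in any JavaScript engine. The value below is 2^53 + 1, the first integer a double cannot represent:

```javascript
// 2^53 + 1 cannot be represented exactly as an IEEE-754 double, so
// JSON.parse silently rounds it to the nearest representable integer.
const parsed = JSON.parse("9007199254740993"); // 2^53 + 1
console.log(parsed);                  // 9007199254740992
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991
```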

Chrome's JavaScript engine V8 gained support for BigInt as a data type in 2018 but cannot serialize them to JSON, with VM134:1 Uncaught TypeError: Do not know how to serialize a BigInt.
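The same V8 error reproduces in Node.js, which uses the same engine:

```javascript
// JSON.stringify has no default serialization for BigInt and throws a
// TypeError instead of guessing at a representation.
let error;
try {
  JSON.stringify(1n);
} catch (e) {
  error = e;
}
console.log(error instanceof TypeError); // true
console.log(error.message);              // Do not know how to serialize a BigInt
```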

ECMAScript has an open issue for what to do about JSON serialization of BigInts, appears inconclusive, with lots of discussion about compatibility.

/cc @markekraus Have you thoughts?

I think JavaScript is one of thousands of use cases for JSON, and probably a small minority among those employed in PowerShell. The primary use cases for JSON in PowerShell are serializing and deserializing data for interaction with REST APIs, or for consumption or storage as settings. I see no reason to refuse to serialize BigInt; there may well be REST APIs that depend on such things.

But, I'd be more interested in what the .NET Core 3.0 JSON implementation does with them. If we still plan to move from NewtonSoft to the .NET Core native implementation, it probably behooves us to do as they do.

If we still plan to move from NewtonSoft to the .NET Core native implementation

@markekraus If we don't still have an issue tracking this please open new issue. Thanks!

Oracle java is the standard

I disagree. RFC-8259 is the standard. How any language implements JSON should be based on that standard.

The relevant part of the RFC is here https://tools.ietf.org/html/rfc8259#section-6

Note that when such software is used, numbers that are integers and
are in the range [-(2^53)+1, (2^53)-1] are interoperable in the
sense that implementations will agree exactly on their numeric
values.

We already support Int64 conversion which is out of range for general interoperability. Int64.MaxValue 9223372036854775807 is greater than Number.MAX_SAFE_INTEGER 9007199254740991. There is no strict limit on this per the RFC, just a warning with regards to interoperability.
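The gap this paragraph describes can be shown directly: Int64.MaxValue is already past what a JavaScript consumer can read exactly, so it silently rounds to the nearest double, which happens to be 2^63:

```javascript
// Int64.MaxValue (9223372036854775807) exceeds Number.MAX_SAFE_INTEGER,
// so JSON.parse rounds it to the nearest representable double, 2^63.
const int64Max = JSON.parse("9223372036854775807");
console.log(int64Max === 2 ** 63);               // true
console.log(int64Max > Number.MAX_SAFE_INTEGER); // true
```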

If PowerShell was written in Java, then interoperability with jdk12 would be critical. However, we are based on .NET Core so we are bound by their support and limitations. Thus, it is far more important for us to consider .NET Core 3+'s implementation than any other language for any area not explicitly restricted by RFC-8259.

I would like to note that if we do end up converting BigInt to a JSON Number, it will be a breaking change. It _could_ be implemented in a non-breaking manner as a configurable option. However, I suspect most users would consider converting BigInt to JSON Number to be the expected default behavior. Whether we make it an option, and what the default behavior should be, needs to be considered carefully.

This issue creates inconsistent behavior with desktop PowerShell 5 though. Since ConvertFrom-Json prioritizes BigInteger over Decimal in PSCore, you can't go from JSON back to JSON without breaking the result so long as BigInteger isn't serialized correctly. This doesn't repro in Windows PowerShell.

Windows PowerShell 5.1.18362.145:

PS C:\> "9223372036854775808" | ConvertFrom-Json | ConvertTo-Json
9223372036854775808

The intermediate type is interpreted as Decimal.

PSCore 7.0.0-rc.1

PS C:\> "9223372036854775808" | ConvertFrom-Json | ConvertTo-Json
{
  "IsPowerOfTwo": true,
  "IsZero": false,
  "IsOne": false,
  "IsEven": true,
  "Sign": 1
}

The intermediate type is interpreted as BigInteger.
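For comparison, the same round trip through JavaScript's own JSON implementation doesn't produce a property bag, but it doesn't preserve the value either; the parse step rounds to the nearest double, so what comes out is not what went in:

```javascript
// Round-tripping Int64.MaxValue through JavaScript's JSON: the value is
// silently rounded on parse, so the serialized result no longer matches
// the original input string.
const input = "9223372036854775807";
const roundTripped = JSON.stringify(JSON.parse(input));
console.log(roundTripped === input); // false: precision was lost on parse
```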
