Node: Intl.NumberFormat disregards maximumSignificantDigits

Created on 6 Aug 2018 · 15 comments · Source: nodejs/node

  • Version: 10.7, 10.8 (possibly others, tested these two)
  • Platform: Darwin 15.6.0, Linux 4.17.10-1-ARCH
$ node
> new Intl.NumberFormat(undefined, { maximumSignificantDigits: 3 }).format(1234500)
'1,234,500'
> new Intl.NumberFormat(undefined, { maximumSignificantDigits: 3 }).format(1234560)
'1,234,560'
> new Intl.NumberFormat(undefined, { maximumSignificantDigits: 1 }).format(1234560)
'1,234,560'

Previously (node v9.2, at least), this behaved correctly:

> new Intl.NumberFormat(undefined, { maximumSignificantDigits: 3 }).format(1234500)
'1,230,000'
> new Intl.NumberFormat(undefined, { maximumSignificantDigits: 3 }).format(1234560)
'1,230,000'
> new Intl.NumberFormat(undefined, { maximumSignificantDigits: 1 }).format(1234560)
'1,000,000'
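
For reference, the expected output is just ordinary rounding to the requested number of significant digits; the same values can be reproduced in plain JavaScript:

Number((1234500).toPrecision(3)) // 1230000
Number((1234560).toPrecision(1)) // 1000000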
Label: i18n-api

All 15 comments

@nodejs/intl @nodejs/v8

These are the results for some relevant Node.js and V8 versions (Windows 7 x64, standard Node.js builds with small-icu):

new Intl.NumberFormat(undefined, { maximumSignificantDigits: 1 }).format(1234560)
Node.js 06.14.3 V8 5.1
'1,000,000'

Node.js 08.11.3 V8 6.2
'1,000,000'

Node.js 10.8.0 V8 6.7
'1,234,560'

Node.js 11.0.0.2018-08-01.nightly V8 6.8
'1,234,560'

Node.js 11.0.0.2018-07-14.v8-canary V8 6.9
'1,000,000'

Node.js 11.0.0.2018-08-01.v8-canary V8 7.0
'1,234,560'


  • thought 1: _it shouldn't matter_, but it is probably better to pin the locale to something like the following, to remove uncertainty:

new Intl.NumberFormat(['en-US'], { maximumSignificantDigits: 1 }).format(1234560)

  • thought 2: does this work in v8 (d8) directly? what is the behavior there?
  • thought 3: 'there oughtta be a test for this' (a sketch of such a test follows below)
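
For thought 3, a minimal sketch of what such a regression test could look like (the assertions use the spec-compliant values already shown above; it assumes English locale data is available, as with full-icu or a small-icu build):

const assert = require('assert');

// With maximumSignificantDigits: 1, 1234560 should round to one significant digit.
assert.strictEqual(
  new Intl.NumberFormat('en-US', { maximumSignificantDigits: 1 }).format(1234560),
  '1,000,000');

// With maximumSignificantDigits: 3, only the three leading digits should survive.
assert.strictEqual(
  new Intl.NumberFormat('en-US', { maximumSignificantDigits: 3 }).format(1234500),
  '1,230,000');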

@srl295

V8 version 6.9.0 (candidate)
d8> new Intl.NumberFormat(['en-US'], { maximumSignificantDigits: 1 }).format(1234560)
"1,000,000"

so right now i'm concerned about this

> new Intl.NumberFormat(['en'], { minimumSignificantDigits:1, maximumSignificantDigits: 3 }).resolvedOptions().maximumSignificantDigits
6

and 3 kinda ≠ 6. (6 happens to be ICU's default maximum significant digits, which hints that the explicit setting is being reset somewhere; see the setSignificantDigitsUsed() note in the patch below.)

-p "new Intl.NumberFormat(['en-US'], { maximumSignificantDigits: 1 }).format(1234560)"
Node.js 06.14.3 V8 5.1
1,000,000
Node.js 08.11.3 V8 6.2
1,000,000
Node.js 10.8.0 V8 6.7
1,234,560
Node.js 11.0.0.2018-08-01.nightly V8 6.8
1,234,560
Node.js 11.0.0.2018-07-14.v8-canary V8 6.9
1,000,000
Node.js 11.0.0.2018-08-01.v8-canary V8 7.0
1,234,560
V8 version 7.0.167
d8> new Intl.NumberFormat(['en-US'], { maximumSignificantDigits: 1 }).format(1234560)
"1,000,000"
> process.versions.node
'11.0.0-v8-canary2018080644d0a70123'
> process.versions.v8
'7.0.158-node.1'
> new Intl.NumberFormat(['en-US'], { maximumSignificantDigits: 1 }).format(1234560)
'1,234,560'

v8 is broken too unless they've mitigated this.

here's a hackaround that should work for any ICU version:

diff --git a/deps/v8/src/objects/intl-objects.cc b/deps/v8/src/objects/intl-objects.cc
index 60c3a0721b..adbb6b460e 100644
--- a/deps/v8/src/objects/intl-objects.cc
+++ b/deps/v8/src/objects/intl-objects.cc
@@ -274,20 +274,31 @@ void SetNumericSettings(Isolate* isolate, icu::DecimalFormat* number_format,
     number_format->setMaximumFractionDigits(digits);
   }

-  bool significant_digits_used = false;
+  int32_t mndig = -1;
+  int32_t mxdig = -1;
+  bool mndig_used = false;
+  bool mxdig_used = false;
   if (ExtractIntegerSetting(isolate, options, "minimumSignificantDigits",
-                            &digits)) {
-    number_format->setMinimumSignificantDigits(digits);
-    significant_digits_used = true;
+                            &mndig)) {
+    mndig_used = true;
   }

   if (ExtractIntegerSetting(isolate, options, "maximumSignificantDigits",
-                            &digits)) {
-    number_format->setMaximumSignificantDigits(digits);
-    significant_digits_used = true;
+                            &mxdig)) {
+    mxdig_used = true;
   }

-  number_format->setSignificantDigitsUsed(significant_digits_used);
+  // per https://unicode-org.atlassian.net/browse/ICU-20063
+  // the following call trashes the min/max settings. So call it first.
+  number_format->setSignificantDigitsUsed(mndig_used || mxdig_used);
+
+  // Now it's safe to set the min/max significant digits.
+  if (mndig_used) {
+    number_format->setMinimumSignificantDigits(mndig);
+  }
+  if (mxdig_used) {
+    number_format->setMaximumSignificantDigits(mxdig);
+  }

   number_format->setRoundingMode(icu::DecimalFormat::kRoundHalfUp);
 }

v8 does NOT have a mitigation for this that I could see… https://github.com/v8/v8/blob/master/src/objects/intl-objects.cc#L294 … so v8 is broken, depending on the ICU version used.

but :( looks like v8 put a hack into its ICU version — here is the patch

it might be better to just put this same patch into node's ICU for this specific version.

@jskeate @vsemozhetbyt I committed a workaround patch in
https://github.com/srl295/node/commit/b53db21a4817958d74a37b9029df689220398a78. If you add this file into node and rebuild, it should mitigate the problem. It only applies to ICU 62; in future ICU versions this should be fixed.
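
A quick sanity check after rebuilding (same expression as used earlier in this thread; with the patch applied it should print 1,000,000 instead of 1,234,560):

node -p "new Intl.NumberFormat(['en-US'], { maximumSignificantDigits: 1 }).format(1234560)"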

Node 10.11 still has this issue. The history is a little hard for me to read, other than that this issue is still marked Open. Can I please get a recap?

node -e "console.log(new Intl.NumberFormat('en-US', { maximumSignificantDigits: 4 }).format(10.001))"

expected: 10, got: 10.001.

The Chrome 69.0.3497.100 (Official Build) (64-bit) console gives the right answer - no idea if that's relevant. ICU's tracked issue looks to have been marked resolved 3 days ago.
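
A possible userland stopgap while on an affected Node version (a sketch only; formatMaxSig is a hypothetical helper, not from this issue or from Node): pre-round with Number.prototype.toPrecision, then format without the significant-digits option.

// Hypothetical helper: emulate maximumSignificantDigits by rounding first.
// Caveat: the default formatter still applies its own fraction-digit limits.
function formatMaxSig(value, maxSignificantDigits, locale) {
  const rounded = Number(value.toPrecision(maxSignificantDigits));
  return new Intl.NumberFormat(locale).format(rounded);
}

formatMaxSig(1234560, 1, 'en-US'); // '1,000,000'
formatMaxSig(10.001, 4, 'en-US');  // '10'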

@JasonKleban OK, sorry. Looks like I did not open a PR for https://github.com/nodejs/node/compare/master...srl295:patch-icu-62-sigdig … and also #23715 brings in ICU 63.1, which fixes this. Let me see what I can do.
