Sdk: dart2js and spec disagree about numerics

Created on 4 Feb 2012 · 23 comments · Source: dart-lang/sdk

The following Dart code:

var x = 123;
print(x is! double);

will print true on the Dart VM and false in the JavaScript compiled by Frog.

The generated JavaScript code in question:

print((typeof(x) != 'number'));

doesn't match the intention of the original Dart code.
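
A minimal sketch (not from the issue text, mine) of the visible consequence, assuming the single JavaScript Number representation that Frog/dart2js use for all Dart numbers:

main() {
  var x = 123;
  print(x is! double);      // true on the VM, false when compiled to JavaScript
  print(x is double);       // false on the VM, true when compiled to JavaScript
  print(identical(1, 1.0)); // false on the VM, true when compiled to JavaScript
}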

Labels: closed-not-planned, web-dart2js


All 23 comments

_This comment was originally written by [email protected]_


See also issue #638 and lengthy discussions on the mailing list last year.

_Added Area-Frog, Triaged labels._

I'm generalizing this to track the numeric discrepancies between the spec (and VM) and the JavaScript-based implementations.

Consider:

main() {
  int billion = 1000000000;
  int quintillion = billion * billion;
  int quintillionOne = quintillion + 1;
  print(quintillion == quintillionOne);
}

It compiles without emitting a warning or error, and when compiled to JavaScript it prints true, which clearly violates the principle of least astonishment.
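
As a minimal sketch (mine, not from the comment above) of why the addition is lost on the web: a JavaScript Number represents integers exactly only up to 2^53, and one quintillion (10^18) is well past that limit:

main() {
  int maxSafe = 9007199254740992; // 2^53, the largest integer a JS Number holds exactly
  print(maxSafe == maxSafe + 1);  // true when compiled to JavaScript, false on the VM
}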

We may end up making allowances for JS in some way, but for now this is a bug, and we can track the issue here.


_Changed the title to: "Frog and spec disagree about numerics"._

_Removed Area-Frog label._
_Added Area-Dart2JS, FromAreaFrog labels._

_Removed FromAreaFrog label._
_Changed the title to: "dart2js and spec disagree about numerics"._

_This comment was originally written by la...@randompage.org_


We need arbitrary-precision integers in our Dart program and have currently implemented our own integer class to get around this bug. Arbitrary precision makes sense for a modern programming language, as it would make numeric overflows a thing of the past.

At any rate, please give us a definitive answer on whether Dart's integers are arbitrary precision or not.

The spec says that integers are arbitrary precision. The open issue is how dart2js will accomplish this with reasonable performance.
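
(Aside, not from this thread: the BigInt class that later shipped in dart:core is one hedged illustration of arbitrary precision behaving identically on the VM and the web, at a performance cost:)

main() {
  // BigInt arithmetic is arbitrary precision on both the VM and the web.
  var quintillion = BigInt.from(1000000000) * BigInt.from(1000000000);
  print(quintillion + BigInt.one == quintillion); // false on both platforms
}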

Issue #4478 has been merged into this issue.

_Added this to the M1 milestone._
_Removed Priority-Medium label._
_Added Priority-High label._

We do not plan to change the way we deal with ints in dart2js for M1. We will take a look at tackling issue #3814 and issue #4437, but general big int support is not something we're ready to tackle anytime soon.

It remains an issue so I'm certainly keeping the bug open.


_Set owner to @kasperl._
_Removed this from the M1 milestone._
_Added this to the Later milestone._
_Removed Priority-High label._
_Added Priority-Medium, Accepted labels._

cc @larsbak.

Issue #5827 has been merged into this issue.

_Removed this from the Later milestone._

_Added this to the Later milestone._

_This comment was originally written by [email protected]_


Issue #6627 has been merged into this issue.


cc @karlklose.

Issue #6627 has been merged into this issue.

Issue #6627 has been merged into this issue.

Issue #7488 has been merged into this issue.

For the record, I would like to note that merging bugs like 7488 and others with 1533 is fine as long as those other bugs are revisited when addressing this issue. This issue seems to have largely become about whether dart2js will support bigints, and I don't think that is the same as what issue #7488 is about, namely doubles without fractional parts being treated as ints by dart2js (and thus being legitimately used, for example, as indices into containers, while the VM throws an exception).
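
A minimal sketch (not taken from issue #7488 itself) of that second behavior: an integer-valued double passes int checks and casts under dart2js but not on the VM:

main() {
  print(1.0 is int);          // true when compiled to JavaScript, false on the VM
  var list = ['a', 'b', 'c'];
  num index = 1.0;
  print(list[index as int]);  // prints 'b' when compiled to JavaScript, throws on the VM
}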

I don't believe this is something we will change in dart2js. We need to map both int and double to the JavaScript Number type to be able to generate code that performs well in JS engines.

We might want to be more explicit/upfront about these exceptions though. Either by making mention of them in the spec or in our documentation. Thoughts?

/cc @floitschG @kwalrath @gbracha

I'm new to Dart and I find the documentation extremely confusing. On the one hand it claims Dart has built-in arbitrary-precision integers, which would actually be a plus for developers coming from other languages and something that might motivate them to learn Dart. On the other hand, this pad proves otherwise: it shows a problem with numbers greater than JavaScript's MAX_SAFE_INTEGER.

Why is this discrepancy between "VM Dart" and "JS Dart" allowed to exist nowadays? Dart is Dart, and it should behave the same on all platforms. Either remove big-int support from the VM or add it to dart2js for the sake of sanity. It is not difficult to implement this in dart2js: just use JavaScript's Number type for safe integers and switch to a JS-based big-int implementation outside that range. That way you could keep the current performance for safe integers.

@homer-jay try reaching out in one of these places: https://www.dartlang.org/community
In particular, I think this mailing list is a great place to ask questions like this: https://groups.google.com/a/dartlang.org/forum/#!forum/misc

For anyone else who runs into issues with shifts under Dart2JS and DDC, there's a simple workaround.

On the web, the result of any bitwise operation on a negative number is an unsigned 32-bit integer. You can turn it back into a signed 32-bit integer with .toSigned(32). Right shifts do move the sign bit correctly, but Dart2JS and DDC discard the sign of the resulting JavaScript number. print(((-1 >> 1).toSigned(32) >> 1).toSigned(32)) yields -1 in both the VM (with and without big ints) and the web.
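
A minimal sketch of that workaround (the helper name is mine, not from this comment):

// Normalize every shift result with toSigned(32) so the VM and the web agree.
int shiftRight32(int value, int amount) => (value >> amount).toSigned(32);

main() {
  print(shiftRight32(shiftRight32(-1, 1), 1)); // -1 on both the VM and the web
}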

However, I do feel that this should be fixed. Since the VM is now moving to signed 64-bit integers, it would make sense to use at least signed 32-bit integers for bit operations on the web.

In some tests on Chrome and Edge, the JavaScript expression ((-1) >> 1) << 0 preserves the sign bit, while ((-1) >> 1) >>> 0 (which is what Dart2JS and DDC emit) does not. Changing the latter to the former would give the web the same results as the VM (at least when bit width isn't taken into account), and the web would be consistent with what JavaScript developers expect (-1 >> 1 yields -1 in JavaScript but 4294967295 in Dart-compiled JavaScript).

This may become more of an issue in the future, so it would be great to know how much code depends on this weird behavior.

But anyway, here are some pros and cons I thought of.

Pros:

  • Unknown! I'd love to hear from someone who works on Dart2JS or DDC on why we have these differences!

Cons:

  • From a glance at code available on GitHub, most libraries use workarounds to get consistent behavior with bitwise operations on both the VM and web.
  • This problem has affected developers for years. (This is issue number 1533 out of 31972)
  • This problem has been one of the pain points of testing on Dartium and then deploying to normal browsers. (Less so today, thanks to DDC!)
  • Sharing code between the VM and the web is tricky. Now that developers have Flutter, they'll notice these differences when they try to share logic between it and the web. A developer might run tests for their common library only on the Dart VM because it's quicker than running them in the browser, and they won't hit these weird bits until they use the library on the web.
  • Result differs from JavaScript, C, C++, Rust, Go, etc.