A bigint literal is a literal, but it does not seem usable in const enums.
const enum Foo {
    Bar = 123n
}
Currently produces, on 3.9.0-beta:
const enum member initializers can only contain literal values and other computed enum values. ts(2474)
Bigint compile-time constants 🙂
const enum Foo {
    Bar = 123n
}
console.log(Foo.Bar);
Should compile to
console.log(123n);
And log 123n when run.
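For comparison, numeric const enum members are already inlined into the emitted JavaScript. A minimal sketch of today's behavior (the enum name here is illustrative, and the exact inline comment in the output varies by compiler version):

const enum Small {
    Answer = 42,
}

console.log(Small.Answer);
// Emitted roughly as:
// console.log(42 /* Answer */);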
Would take a PR to improve the error message but I don't understand the use case very much. What kind of enums would be backed by bigints?
The use case would be similar to using normal numbers in enums, just for scenarios where we're dealing with integers that don't fit into a float64. The place I ran into this is dealing with IPv6 addresses. For IPv4 addresses, I can have an enum of well-known masks like
const enum IPv4Masks {
    MulticastMask = 0x00_ff_ff_ff,
    // ...
}
But IPv6 addresses are 128-bit integers. So, because const enums don't support bigints, I'm unable to similarly do
const enum IPv6Masks {
    MulticastMask = 0x00ff_ffff_ffff_ffff_ffff_ffff_ffff_ffffn,
    // ...
}
You could also easily imagine large bitsets with >52 values.
const enum Permissions {
    Read = 1n << 0n,
    Write = 1n << 1n,
    // ...
    BeFancy = 1n << 58n,
}
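If bigint const enums existed, such flags would compose with the ordinary bigint bitwise operators. A small runnable sketch using plain bigint constants in place of the (currently rejected) enum members; hasPermission and the constant names are made up for illustration:

// Stand-ins for the would-be enum members.
const Read = 1n << 0n;
const Write = 1n << 1n;
const BeFancy = 1n << 58n;

// Checks whether all bits of `permission` are set in `flags`.
function hasPermission(flags: bigint, permission: bigint): boolean {
    return (flags & permission) === permission;
}

const granted = Read | BeFancy;
console.log(hasPermission(granted, BeFancy)); // true
console.log(hasPermission(granted, Write));   // false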
I don't think there should be an error message at all here, since bigint literals are just another type of literal; restricting their use in const enums seems arbitrary.
Since Numbers and BigInts don't mix, I'd vote to require the whole enum to be consistent: either all literals are Numbers, or all literals are BigInts.
If there are no literals, but you'd like your enum to be BigInt-backed, you'd just add = 0n to the first item:
enum Foo {
    CAT = 0n,
    DOG, // 1n
    OWL, // 2n
}
Currently enums don't require their values to be consistent; for example, mixing number and string members like this is valid:
enum Foo {
    Bar1 = 1,
    Bar2 = 'hello',
}
You could also easily imagine large bitsets with >52 values.
Actually, >31 values, since bitwise operators in JS convert Number operands to signed 32-bit integers, which makes the 32nd bit effectively unusable. Being able to use BigInts would be very helpful for interop with native code.
Yes, supporting bigint allows more than 31 flags:
[1<<0, 1<<31, 1<<32] // [1, -2147483648, 1]
[1n<<0n, 1n<<31n, 1n<<32n] // [1n, 2147483648n, 4294967296n]
A possible problem with bigint enums is ambiguous reverse mappings. For example, the following code will not work as expected:
enum A {
    a = 1,
    b = 1n,
}

// Compiled to:
//   A[A["a"] = 1] = "a";
//   A[A["b"] = 1n] = "b";

// Expected: a
// Received: b
console.log(A[A.a]);
Also, the expression A[A.b] should be an error, because TS doesn't allow bigints as indices.
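At runtime the ambiguity comes from key coercion: object property keys are strings (or symbols), so 1 and 1n both end up under the key "1". A runnable sketch of that coercion; the explicit String() call is only there because, as noted above, TS doesn't accept a bigint index directly, while the emitted JavaScript would coerce it implicitly:

const reverse: Record<string, string> = {};
reverse[1] = "a";           // stored under the key "1"
reverse[String(1n)] = "b";  // 1n also maps to the key "1", overwriting "a"
console.log(reverse[1]);    // "b"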
I think two solutions exist.
The first option: emit bigint members just like string members, with a forward mapping only:
A[A["a"] = 1] = "a";
A["b"] = 1n;
The second option: add dedicated bigint enum declarations (and, by analogy, number enum and string enum), keeping reverse mappings. This also requires relaxing the index type rules, since a bigint member would then be used as an index:
bigint enum A {
    a = 1, // Error: a number literal in a bigint enum.
    b = 1n,
}

bigint enum B {
    a = 1n,
}

console.log(B[B.a]); // 'a'
Compiles to:
// ...
A[A["a"] = 1] = "a";
A[A["b"] = 1n] = "b";
// ...
B[B["a"] = 1n] = "a";
// ...
console.log(B[B.a]);
Const enums are probably easier to implement, as they don't require reverse mappings. All we need is to allow BigIntLiterals in the places where NumberLiterals are allowed, and to evaluate bigint expressions the same way as numeric ones, plus a few additional checks.
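The constant operators such an evaluator would need already behave identically for bigints, provided both operands are bigints; a runnable sketch, reusing the values from the Permissions example above:

const read = 1n << 0n;       // 1n
const beFancy = 1n << 58n;   // 288230376151711744n
const mask = read | beFancy; // 288230376151711745n
// Mixing number and bigint operands is a type error in TS and a TypeError
// at runtime, which is presumably among the additional checks mentioned above.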
Also, reverse mappings for bigint members probably aren't critical. These members will most probably be used for bit flags, where reverse mappings may not work even with regular numbers:
enum A {
    a = 0b01,
    b = 0b10,
}

console.log(A[A.a | A.b]); // undefined, with no error.
Here is a possible implementation of bigint const enums (assuming TSC host runtime supports bigints): https://github.com/miyaokamarina/TypeScript/commit/1fbcc8e226697c648242aaf6daee75244fe0a9fb
@miyaokamarina, I'd prefer just to have an error if bigint and non-bigint literals were mixed in an enum.
@yseymour, I don't understand why this should be an error. Both const and non-const enums already allow mixed member types. Would it be an error if string and bigint members were mixed? I think that behavior would be inconsistent with how enums currently work.
Enums, and especially const enums, are underspecified after all, but I don't think that's a reason to add more strange behaviors. If enums allowed the underlying type to be specified explicitly, there would be no problem warning about mixed types, but currently they don't.
For example, underlying types could be specified either with leading keywords (number enum, bigint const enum) or with an extends clause (enum E extends string, enum F extends bigint). But I think that's a much bigger change than just allowing bigint literals in const enums.
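Until something along those lines lands, a plain object with as const is probably the closest workaround for a bigint-backed "enum" today; there is no inlining and no reverse mapping, and the names are just the illustrative ones from earlier in the thread:

const IPv6Masks = {
    MulticastMask: 0x00ff_ffff_ffff_ffff_ffff_ffff_ffff_ffffn,
    // ...
} as const;

type IPv6Mask = (typeof IPv6Masks)[keyof typeof IPv6Masks];

console.log(IPv6Masks.MulticastMask);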