TypeScript: TS 4.0 variadic tuple types: type inference with optional element not working as expected

Created on 25 Jun 2020  ·  8 Comments  ·  Source: microsoft/TypeScript


TypeScript Version: 4.0.0-dev.20200624 (Nightly)


Search Terms: variadic tuple optional element type inference

Code

type Foo<T extends string[]> = [...T, number?]

// These are the expected types
const v11: Foo<[]> = [] // [(number | undefined)?]
const v12: Foo<["bar"]> = ["bar"] // ["bar", (number | undefined)?]
const v13: Foo<["bar"]> = ["bar", 42] // ["bar", (number | undefined)?]

// inferring types with helper function produces different results
function foo<T extends string[]>(t: [...T, number?]): [...T, number?] { return t }
const v21 = foo([]) // (string | number | undefined)[]  //❓
const v22 = foo(["bar"]) // [(number | undefined)?]   // error,❓
const v23 = foo(["bar", 42]) // ["bar", (number | undefined)?] // ✅

Expected behavior:
The types inferred via the foo helper function (v21, etc.) should be the same as the explicitly annotated types (v11, etc.).

I would expect the passed in array literal value in foo to be narrowed to a tuple type, as explained in the PR:

When the contextual type of an array literal is a tuple type, a tuple type is inferred for the array literal. The type [...T], where T is an array-like type parameter, can conveniently be used to indicate a preference for inference of tuple types
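For illustration, a minimal sketch of that rule (asTuple is a hypothetical helper name):

```typescript
// [...T] in a parameter position signals a preference for tuple inference,
// so an array literal argument is inferred as a tuple rather than an array.
function asTuple<T extends unknown[]>(t: [...T]): T {
  return t;
}

const t1 = asTuple(["bar", 42]); // inferred as [string, number], not (string | number)[]
const t2 = asTuple([]);          // inferred as []
```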

Actual behavior:
Either the resulting type is an array (v21), or it does not take the variadic type into account (v22), despite the explicit [...T, number?] return type of foo. Also, v22 now produces an error.

I could live with v21, but shouldn't it be clear to the compiler in v22 that the resulting type is akin to ["bar", undefined?]?

Playground Link:
here

Related Issues:

Bug Fix Available

All 8 comments

In the v21 case we make no inferences for T. Thus, T defaults to the string[] constraint, and instantiating [...T, number?] with string[] for T produces (string | number | undefined)[]. We really should have inferred [] for T in this case since it would produce a compatible type (because the number? element can be omitted).

In the v22 case we infer [] for T and I think that's the best we can do. You could argue that we should have inferred ["bar"], but that would require inference to make decisions based on type checks (such as "bar" isn't assignable to number? so instead infer to ...T) which we generally don't do. Inference is supposed to determine the types _independent_ of type relationships, which are then checked in a following phase. It's sort of similar to the reasons we can't always infer to [...T, ...U] because we don't know where one tuple ends and the other one starts.
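The [...T, ...U] point can be sketched with a contrast (pair is a hypothetical helper): when T and U come from separate arguments, the grouping is unambiguous and inference works; folding both into one parameter would not be.

```typescript
// Unambiguous: each variadic element has its own argument, so the compiler
// knows exactly which source elements belong to T and which to U.
function pair<T extends unknown[], U extends unknown[]>(
  t: [...T],
  u: [...U]
): [...T, ...U] {
  return [...t, ...u];
}

const p = pair(["a"], [1, 2]); // inferred as [string, number, number]

// Inferring from a single [...T, ...U] parameter would be speculative: given
// ["a", 1, 2], T could be [], [string], [string, number], or the whole tuple.
```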

BTW, if you declare a default value for T, as in T extends string[] = [], the v21 example works as expected. But that shouldn't be necessary.
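A sketch of that workaround (fooDefault is a made-up name): with a default of [], a failed inference falls back to [] instead of the string[] constraint.

```typescript
// The "= []" default replaces the string[] constraint as the fallback
// when no inference can be made for T.
function fooDefault<T extends string[] = []>(t: [...T, number?]): [...T, number?] {
  return t;
}

const w = fooDefault([]); // [(number | undefined)?] rather than (string | number | undefined)[]
```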

To be clear, the bug label refers only to the v21 case above. The v22 case is working as intended.

Thank you for the detailed explanation, very interesting! I got your point concerning v22, which makes sense, given how type inference is supposed to work.

Some more thoughts on this...

The general guiding principle I've used in variadic tuple type inference is to not make speculative inferences. For example, don't attempt to infer to [...T, number, ...U] because (a) it would require type relationship checks during inference (e.g. when inferring from [string, string, number, string]) and (b) it would often be speculative (e.g. when inferring from [number, number, number, number]). A key characteristic of speculative situations is that there is more than one variable length element in the inference target and thus it isn't clear how to group the source elements.
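The grouping ambiguity can be made concrete with an illustrative alias (Around is a made-up name): two different explicit instantiations produce the same tuple, so inference from that tuple cannot choose between them.

```typescript
// A target with two variable-length elements around a fixed one.
type Around<T extends unknown[], U extends unknown[]> = [...T, number, ...U];

// Both groupings yield [number, number, number, number]:
type A1 = Around<[number], [number, number]>;
type A2 = Around<[number, number], [number]>;

// The same value satisfies both groupings, which is what makes any
// inference from [number, number, number, number] speculative.
const sample: A1 & A2 = [1, 2, 3, 4];
```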

With that in mind, I'm now thinking that we shouldn't even attempt to infer to [...T, number?] because there is more than one variable length element in the inference target, and inferences are bound to be speculative. And if we attempt it, we'd minimally need to give such inferences lower priority than inferences to a single variable length element because the latter are more accurate. For example, consider

function foo<T extends unknown[]>(a: [...T], b: [...T, number?]): T;

foo([1, 2], [1, 2, 3]);  // Should infer [1, 2]
foo([1, 2, 3], [1, 2, 3]);  // Should infer [1, 2, 3]

We'd need to give higher priority to inferences from the first argument because they're not speculative.

Generally, I'd prefer avoiding speculative inferences, so I think the resolution to this issue is to simply make no inferences to [...T, number?]. I know that may not be the desired outcome, but I think it is the most consistent.

With #39281 we no longer make speculative inferences to variadic tuple types.

declare function foo1<T extends string[]>(t: [...T, number?]): T;
const v11 = foo1([]);  // string[]
const v12 = foo1(["bar"]);  // string[]
const v13 = foo1(["bar", 42]);  // string[]

declare function foo2<T extends string[] = []>(t: [...T, number?]): T;
const v21 = foo2([]);  // []
const v22 = foo2(["bar"]);  // [] and error
const v23 = foo2(["bar", 42]);  // [] and error

Use overloads to get the desired effect of an optional ending number element:

declare function foo3<T extends string[] = []>(t: [...T, number]): T;
declare function foo3<T extends string[] = []>(t: [...T]): T;

const v31 = foo3([]);  // []
const v32 = foo3(["bar"]);  // ["bar"]
const v33 = foo3(["bar", 42]);  // ["bar"]
const v34 = foo3(["bar", "baz"]);  // ["bar", "baz"]

Sounds like a reasonable compromise between practicality and consistency.

Given the following example, I guess the #39281 change also reports an error here instead of inferring (string | number | undefined)[]?

function foo<T extends string[]>(t: [...T, number?]): [...T, number?] { return t }

const v24 = foo([42, 42]) // no errors, with even more divergent input value
// v24 type: (string | number | undefined)[]

Playground

If I have understood correctly, same as above holds true: the compiler cannot infer T, takes the base constraint string[] and combines it with number?.

If I have understood correctly, same as above holds true: the compiler cannot infer T, takes the base constraint string[] and combines it with number?.

That's correct: when we make no inferences for a particular type parameter, we default to the constraint. Tuple type normalization then turns [...string[], number?] into (string | number | undefined)[]. Unfortunately, that latter transformation is sound only for reading, so some type safety is lost. Ideally we'd turn it into (string & (number | undefined))[] for writing, but we aren't equipped to reason about upper and lower bounds separately. You can avoid this issue by specifying a default of [] for the type parameter, which provides a very restrictive fallback in the case where inference fails.
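A sketch of that read/write asymmetry, using the foo signature from the question above (the explicit string[] type argument forces the constraint fallback regardless of inference behavior):

```typescript
function foo<T extends string[]>(t: [...T, number?]): [...T, number?] {
  return t;
}

// Instantiating T with string[] normalizes the type to (string | number | undefined)[].
const v = foo<string[]>(["bar"]);

// Sound for reading: every element really is string | number | undefined.
const first = v[0];

// Unsound for writing: the normalized element type accepts a number at a
// position where the underlying value held a string.
v[0] = 42;
```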
