TypeScript Version: 3.2.4, 3.3.0-dev.20190126
Search Terms: array, flat, concat, any, unsound
Code
```ts
const a: boolean[] = [[17], ["foo"]].flat(); // wrongly accepted
const b: boolean[] = Array.prototype.concat([17], [19], [21]); // wrongly accepted
```
Using `tsc --strict -t esnext`.
(What I'm really trying to do: find a type-safe way to concatenate an array of arrays. It'd be nice if `[].concat(...arrays)` were accepted, which I think would be a consequence of #26976, but that's not the issue I'm reporting here. `arrays.flat()` and `Array.prototype.concat(...arrays)` are accepted but apparently provide no type safety.)
Expected behavior: `[[17], ["foo"]].flat()` should be typed `(number | string)[]` (if it's accepted at all), and `Array.prototype.concat([17], [19], [21])` should be typed `number[]`. In both cases the assignment to `boolean[]` should be rejected.
Actual behavior: TypeScript infers `any[]` for both right-hand sides (even with `--noImplicitAny` implied by `--strict`), and accepts the assignments to `boolean[]` without complaint.
Playground Link: Playground doesn't support esnext, but here's the second line.
Related Issues: #19535 for `concat` (but that one may have more to do with the difficulty of typing `concat`'s weird behavior when non-arrays are passed?), #26976
Array.prototype.flat has several overloads. Since you're getting any[] on the RHS without an "implicit any" error, it appears you might be hitting this one:
https://github.com/Microsoft/TypeScript/blob/master/lib/lib.esnext.array.d.ts#L132
@fatcerberus Indeed, that explains the unsoundness problem with flat but not with concat. (Should I split this into separate bug reports?)
I think the `concat` case might be just because you're calling it through `Array.prototype`, and therefore it has no `this` from which it can infer the proper type.
Yes, I was right: `Array.prototype` is inherently typed as `any[]`, so calling it that way gets a return type of `any[]` as well.

Calling it this way instead produces the expected type:
```ts
const b: boolean[] = [17].concat([19], [21]); // type error!
```
As for flat, I'm thinking it falls back on the catch-all any[] overload because it can't infer the single type parameter U from multiple places when the types differ in each place. Rather than infer U as a union of each type, it picks the type at the first inference site and just uses that. Once that happens, none of the overloads match except for the generic any[] version.
To illustrate the above point:
```ts
declare function f<T>(a: T, b: T, c: T): T;
/* at this point one might expect T to be inferred as string | number | boolean,
 * but instead: */
f("pig", 812, true); // ERROR: 812 not assignable to type "pig"
/* ...it's inferred as string! */
```
I don't think that's a bug, just a limitation in how the type inference machinery works.
> `Array.prototype` is inherently typed as `any[]`

I'm pretty new to TypeScript, but that seems like a bug to me.
> Calling it this way instead produces the expected type:
> `const b: boolean[] = [17].concat([19], [21]); // type error!`
Yeah, but that doesn't help for my actual use case, which is `Array.prototype.concat(...arrays)`.
(I suppose I could write `arrays[0].concat(...arrays.slice(1))` or `arrays.shift().concat(...arrays)` or something, but, yikes.)
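For what it's worth, one way to get a typed flatten today is a small hand-rolled helper (the `flatten` name and shape here are my own illustration, not anything from the lib typings under discussion): because the element type is inferred from the parameter, the compiler can type the result precisely instead of falling back to `any[]`.

```typescript
// Hypothetical helper, not part of the standard lib: the element type T is
// inferred from the argument, so the result is typed T[] rather than any[].
function flatten<T>(arrays: T[][]): T[] {
  // concat on a typed empty array literal keeps the element type
  return ([] as T[]).concat(...arrays);
}

const nums = flatten([[17], [19], [21]]); // inferred number[]
// const bad: boolean[] = flatten([[17], [19]]); // rejected by the compiler
```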
> Once that happens, none of the overloads match except for the generic `any[]` version.

Fine, but why is there a generic `any[]` version? (Especially one that takes its depth argument optionally?)
I don't know what else you could type Array.prototype as, except maybe never[], but then you couldn't really use it for anything. Since concat is an instance method, it gets its return type from the type of the object you called it on, and in your case you called it on Array.prototype which is any[]. Trust me, it's not a bug.
> Fine, but why is there a generic `any[]` version [of `flat`]?

Because the alternative would be to produce a type error for the call in question (regardless of what it's being assigned to!) due to the aforementioned design limitation.
https://github.com/Microsoft/TypeScript/blob/master/lib/lib.es5.d.ts#L1355
Just in case it wasn't clear that the typing of Array.prototype is deliberate.
To play devil's advocate for a bit, I suppose one could make the case for a generic overload of concat such as:
```ts
concat<U>(...args: U[][]): (T | U)[]
```
but thanks to type inference that would be very easy to abuse and accidentally create not-at-all-typesafe code.
@fatcerberus A bug that one can explain as an unfortunate interaction between several possibly deliberate features may still be a bug. There shouldn't be a fundamental reason that there's no way to safely type a simple array-flattening operation. (Forgive me if I missed some way you might be speaking on behalf of the TypeScript team; I believe you're trying to be helpful, but your GitHub profile doesn't show an official association.)
The whole point of using --strict is that I _want_ TypeScript to raise type errors about code it doesn't understand rather than giving up and using any. I understand that there will be limitations to both completeness and soundness because we're talking about JavaScript, but I still think it's legitimate to suggest that this oughtn't be one of them.
Could `Array.prototype` be typed as `{}[]`? Or some other new interface entirely? Or something that still resolves to `any[]` but throws errors in `--strict`? I don't know, but surely there's some solution that makes sense.
I'm not a TypeScript team member, but I do know that soundness isn't really a design goal of TS (even with the strict options in play). So if a completely sound type system is what you're ultimately looking for, I'd expect some pushback from the actual team members. Just as fair warning. :smile:
See Non-goals, here (specifically the 3rd bullet point):
https://github.com/Microsoft/TypeScript/wiki/TypeScript-Design-Goals#non-goals
@fatcerberus Don't worry, like I said, I know what TypeScript is and isn't, and I would accept a reasoned conclusion that we can't do better in this case, but I still think it's more likely than not that we can.
As further evidence that this ought to be worth at least considering, Flow correctly infers `number[]` for both `Array.prototype.concat([17], [19], [21])` and `[].concat([17], [19], [21])`. (I understand that TypeScript is very different from Flow, etc.; no need to start another tangent about that. Let's use further comments to talk about the actual issue at hand.)
One option that would improve type safety would be to change the `any[]` overload of `flat()` to `unknown[]`, and likewise for the typing of `Array.prototype`. This would force you to use a type assertion and prevent accidents like the one in the OP, but I don't know whether it would go against that "balance between correctness and productivity"; in my experience the design seems to deliberately avoid creating situations that force a cast, preferring `any` in those situations. BUT I admit the possibility that that might only be because `unknown` is relatively new. :smile:
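To make the proposal concrete, here is a sketch simulating what an `unknown[]` fallback would mean for the OP's code (the `result` variable is a stand-in of my own; this is not current lib behavior):

```typescript
// Simulating the proposed unknown[] fallback (hypothetical typing):
const result: unknown[] = [17, "foo"];      // stand-in for what flat() would return
// const bad: boolean[] = result;           // would be a compile error under unknown[]
const ok = result as (number | string)[];   // an explicit, visible assertion is required
```

The point is that the unsound assignment becomes impossible to write by accident; you have to opt in with a cast.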
On the other hand, there is a practical reason I can think of for the `any` defaults: with all "strict" flags turned off, TypeScript is a true superset of JavaScript; this much I can say with confidence is a deliberate part of the design. Using something other than `any` to fill gaps in the type system would force the use of TS syntax (type assertions) in perfectly legal JS code, and we could no longer claim to be a perfect superset.
I think a more comprehensive solution to this problem and others like it would be to have a flag (opt-in) that made any behave like unknown does today.
@fatcerberus Huh? TypeScript is supposed to be a superset of JavaScript _syntactically_, but it never pretends to be anything like a superset in the sense that you've described. TypeScript rejects all manner of valid JavaScript code like `10 > "9"` and `[1].push("s")` and `var a = 1; a = "s"` and `var o = {}; o.x = 1` by default. (Is this really the conversation we need to be having? Right here and now? Whether a type checker should be making any effort to, like…check types…? At all? Do you do this on every bug report?)
I could get into a big debate about the merits of gradual typing here, but I won't--I'll just point out that the technical side of this issue is the existence of // @ts-check: TypeScript can be used to type-check .js files:
https://www.typescriptlang.org/docs/handbook/type-checking-javascript-files.html
This is relevant because it relies entirely on type inference (or optional JSDoc comments, but not everybody has them): therefore, if there are known gaps in the type inference mechanism, they must be filled with any to avoid rejecting valid code with no available recourse--since you can't add type annotations or type assertions to a .js file and still have it be valid JS.
@fatcerberus I don't understand why you still think I'm questioning the merits of gradual typing. By the way, three of the four snippets in my last comment (`10 > "9"`, `[1].push("s")`, `var a = 1; a = "s"`) are still type errors in `.js` files with `// @ts-check`, and clearly some kind of `.js`-specific exception has been made for `var o = {}; o.x = 1`.
Let's please go back to the actual code that this issue is about.
TypeScript gives `[[17], ["foo"]]` the type `(string[] | number[])[]`. It seems to believe that `(string[] | number[])[]` is convertible to `(string | number)[][]`, but it won't do this automatically for the purpose of overload resolution of `.flat()`. In any event, this seems like a sufficiently weird corner case that it doesn't seem worth filling this gap with `any`. It's certainly at least as weird as `[1].push("s")`.
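The conversion in question can be demonstrated directly (a sketch; exact inference may vary across TypeScript versions): once the inferred union-of-arrays type is widened by an explicit annotation, the non-fallback overload of `flat()` applies and the result is typed precisely.

```typescript
const nested = [[17], ["foo"]];                // inferred (string[] | number[])[]
const widened: (string | number)[][] = nested; // this assignment is accepted
const flattened = widened.flat();              // typed (string | number)[], not any[]
// const bad: boolean[] = widened.flat();      // rejected by the compiler
```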
As for `Array.prototype`, if that gap is important enough to fill with `any`, then perhaps it's important enough to fill with a real interface?
Edge cases for `flat` and similar functions must be filled with `any` because, if that code happens to be in a type-checked `.js` file, you don't have the option of casting the argument(s) or return value to the proper type (unless there's some weird JSDoc method that I'm not aware of). That's why I brought up `@ts-check` above.
I presume the rationale for the typing of `Array.prototype` is similar, but it's a different enough case that I'll wait for someone else to weigh in on it.
With the above being said, it would be very nice if the compiler could catch the fallback overload being used (with `noImplicitAny`) and treat it as an "implicit any" situation, but the fact that it can't do this is more "design limitation" than "bug", insofar as it's impossible at the syntax level to declare something as "implicit any" in a declaration file.
The distinction between bug and design limitation is important, as it affects how much effort is needed to fix it. One is just a code change; the other involves engineering work. :smiley:
@andersk I apologize if I've come off as patronizing; it isn't my intent. It's just that sometimes there are a lot of unforeseen issues in play (corner cases, etc.) and it's hard to be sure both sides understand them all.
On the design point (and again, disclaimer: not on the TS team): because TypeScript advertises itself as having "optional typing", I think the design as a whole favors type inference over explicit annotations, and in cases where type inference can't infer a meaningful type it generally prefers to step aside rather than actively get in your way. The strict options do catch some of these, but a balance has to be struck between catching potential errors and being annoying. Sometimes the line is a bit blurry.
> Edge cases for `flat` and similar functions _must_ be filled with `any` because, if that code happens to be in a type-checked `.js` file, you don't have the option of casting the argument(s) or return value to the proper type (unless there's some weird JSDoc method that I'm not aware of). That's why I brought up `@ts-check` above.
Like I just said, that logic doesn't apply to `[1].push("s")`, so I see no reason it should automatically apply to `[[17], ["foo"]].flat()`. That said, you absolutely do have that option in a `.js` file (in the unlikely event that you want to write this code at all):
```js
// @ts-check
const a = [[17], ["foo"]];
/** @type {(number | string)[][]} */
const b = a;
b.flat() // inferred (string | number)[]
```
so it doesn't apply anyway.
(Please feel free to email me privately if you'd like to further lecture me with your ideas about TypeScript's goals and the merits of gradual typing and the difference between bugs and design limitations. I assure you, it's a waste of comment space here.)
I'm really not trying to argue that this behavior is ideal, if that's what you think. Quite the opposite. All I'm saying is that fixing it isn't necessarily trivial. For `flat` (as that's the really interesting case), my thought process goes something like this (and here I'm speaking only for myself, if I were to set about fixing it):
The ideal solution for `flat` specifically would be to keep the `any[]` overload in place (so that projects with `--noImplicitAny` turned off, which are more likely to rely on such fallbacks, don't suddenly break) but to treat it as an "implicit any" when `--noImplicitAny` is turned on. However, this solution can't be implemented as described without introducing a whole new `implicitAny` type to the language, which is a high bar to clear.
Failing that, a more short-term fix would be to remove the `any[]` overload entirely, forcing you to cast to the proper type in case of any gaps in type inference (I proposed something like this above) or provide the type parameter manually. Or else make the final overload return `unknown[]` or `never[]`, which changes where you put the cast but in practice has the same effect.

In either case I'm sure we can agree this would be a breaking change; while the example in the OP is a contrived corner case, I'm not convinced it won't break real use cases given the existence of type unions. And while sure, you can add a cast to make it work, you can also cast to make the `boolean[]` assignment work, so I don't know how much that buys you in practice.

There's also a practical concern on my part that this would simply be trading false negatives (silent acceptance of the wrong type, leading to unsoundness) for false positives (type errors in otherwise correct code, leading to time wasted hunting for a mistake, or to reflexively adding a cast, which is a bad habit to get into) and therefore no net gain. Maybe I'm way off base on this, but I've seen enough issues caused by "corner cases" in the handling of generics and unions that I can't help thinking this "easier" change would break real-world code.
If there were an obvious easy fix for this, sure, I would be all over it. `Array.prototype` IMO is such a case: I would probably just re-type it as `never[]` (the same as an unannotated empty array literal) and call it a day. Doing so would prevent shenanigans like:
```ts
// all of these would be type errors
Array.prototype.push("foo");
Array.prototype.push("bar");
Array.prototype[2] = "baz";
```
at the cost of disallowing use of `Array.prototype.concat(...)` without a cast, which is painful for the use case in the OP but arguably a fair tradeoff, especially since it improves soundness and prevents a lot of patent nonsense.
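For illustration, here is roughly what that cast might look like under the `never[]` proposal (a sketch; the same cast also compiles against today's `any[]` typing, where it is merely redundant):

```typescript
// Casting the prototype to the intended element type before calling concat.
// Calling concat on Array.prototype works at runtime because the prototype
// is itself an (empty) array, so `this` is [].
const arrays: number[][] = [[17], [19], [21]];
const flattened: number[] = (Array.prototype as number[]).concat(...arrays);
```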
Putting any speculation on how TypeScript is designed aside, as a user of TypeScript I still care about this kind of thing. Fixes that look trivial on the surface, made carelessly, can break valid use cases in unforeseen ways, especially changes to typing in a language whose type system is Turing complete (which doesn't remotely apply in this case, but I needed an excuse to throw it in, so don't kill me please?)
Anyway, I've put in my 2c (there, um, may have been ~~some~~ a lot of inflation), so I'll duck out at this point and we can wait to see what the TS team thinks about this.
@andersk @fatcerberus any interest in writing a TL;DR of this thread?
@RyanCavanaugh The goalposts didn't move as a result of the discussion, AFAIK; the issue is still as described in the OP.
The array element type should be unwrappable pretty easily:
```ts
type Elem<X> = X extends (infer Y)[] ? Y : X;

interface Array<T> {
    flat(depth: 7): Elem<Elem<Elem<Elem<Elem<Elem<Elem<Elem<this>>>>>>>>[];
    flat(depth: 6): Elem<Elem<Elem<Elem<Elem<Elem<Elem<this>>>>>>>[];
    flat(depth: 5): Elem<Elem<Elem<Elem<Elem<Elem<this>>>>>>[];
    flat(depth: 4): Elem<Elem<Elem<Elem<Elem<this>>>>>[];
    flat(depth: 3): Elem<Elem<Elem<Elem<this>>>>[];
    flat(depth: 2): Elem<Elem<Elem<this>>>[];
    flat(depth?: 1): Elem<Elem<this>>[];
    flat(depth: 0): Elem<this>[];
    flat(depth?: number): any[];
}
```
If I do so then I get correct errors:
```ts
const x = [1, [[2]]];
const y: never = x.flat();
```
produces
```
Type '(1 | [2])[]' is not assignable to type 'never'.
```
I happened to end up with a similar problem when playing with metaprogramming: https://github.com/Microsoft/TypeScript/issues/14833#issuecomment-460820626
Ran into what I would consider a related issue involving `flatMap` over at #31033, and ended up coming up with a solution that essentially does the same as what @shicks has suggested here (i.e. using a conditional type with inference).
Rolling this one into #36554