TypeScript: Expression produces a union type that is too complex to represent

Created on 29 Aug 2019 · 4 Comments · Source: microsoft/TypeScript


TypeScript Version: 3.6.2


Search Terms:

  • union type
  • partial

Repo

  1. git clone https://github.com/Microsoft/typescript-template-language-service-decorator
  2. npm install
  3. Open src/template-language-service-decorator.ts
  4. On line 44, remove the any cast on the line:
```ts
    for (const { name, wrapper } of this._wrappers) {
        (intercept[name] as any) = wrapper(languageService[name]!.bind(languageService));
    }
```
  5. Try compiling the project

Expected behavior:
This compiles with TS 3.4

Actual behavior:
Compile fails with TS 3.5+

```
src/template-language-service-decorator.ts:44:13 - error TS2590: Expression produces a union type that is too complex to represent.

44             intercept[name] = wrapper(languageService[name]!.bind(languageService));
               ~~~~~~~~~~~~~~~
```

The code likely needs to be rewritten here, but I'm opening this issue to make sure the error is expected.
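One possible rewrite (my own sketch, not from the issue; `setEntry` and `MiniService` are hypothetical names) is to route the assignment through a generic helper, so the checker sees a single key type parameter `K` instead of distributing over the full union of keys:

```typescript
// Hypothetical workaround sketch: assign through a generic helper so the key is
// a single type parameter K rather than a distributed union of all keys.
interface MiniService {
    dispose(): void;
    getVersion(): number;
}

function setEntry<T, K extends keyof T>(target: Partial<T>, key: K, value: T[K]): void {
    // T[K] is assignable to Partial<T>[K] (= T[K] | undefined), so no cast is needed.
    target[key] = value;
}

const intercept: Partial<MiniService> = {};
setEntry(intercept, "getVersion", () => 42);
console.log(intercept.getVersion!());
```

Whether this avoids the error on the real 52-method `LanguageService` would need to be verified against the actual repo; the sketch only shows the general shape of the fix.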

Playground Link:

Related Issues:

Bug Performance


All 4 comments

This error was added intentionally, so it's likely by design, but we should take a look at the code to confirm.

Righto, so `name` is a union of the 52 (! that API is big) keys in the language service object, while `intercept` is a `Partial<LanguageService>`, which means every property is a union of its actual type (often a signature-containing object) and `undefined`. When we're assigning to this, we produce an intersection like `((() => void) | undefined) & (((fileName: string) => DiagnosticWithLocation[]) | undefined) & (((fileName: string) => Diagnostic[]) | undefined)` and so on, which we then normalize (we distribute to lift the union to the outside). This normalization spreads the types, so you get something like `(() => void) & undefined & undefined | (() => void) & ((fileName: string) => DiagnosticWithLocation[]) & undefined | (() => void) & ((fileName: string) => DiagnosticWithLocation[]) & ((fileName: string) => Diagnostic[])` and so on, enumerating every possible combination of the input types. Without any kind of simplification taken into account, an intersection of 52 unions of 2 elements would need to produce 2^52 ≈ 4.5e15 resulting top-level union elements. That's _way_ too many. We calculate that number in advance, see that it's greater than our cap of 100,000, and issue the error in the OP.

Now, in this specific case, where we're distributing a ton of unions, all of which contain `undefined`, I think we could preemptively simplify to greatly reduce that number, since `undefined` can't be subtyped under our current rules (`undefined` intersected with pretty much anything else is always `never`). I posited as much when we originally added the limit, and it looks like we do have reason to do so. It's probably worth noting that what I'm thinking of will only help situations where every input is a union containing `undefined` (and maybe `null`), though. :S
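The arithmetic in the comment above can be checked directly: distributing an intersection of 52 two-element unions yields 2^52 combinations, far beyond the checker's stated cap of 100,000.

```typescript
// Each of the 52 properties contributes a 2-way union (its real type | undefined),
// so full distribution yields 2 ** 52 top-level union members.
const members = 52;
const choicesPerMember = 2;
const combinations = choicesPerMember ** members;
const cap = 100_000;

console.log(combinations);       // 4503599627370496 (~4.5e15)
console.log(combinations > cap); // true: the checker bails with TS2590
```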

My question: if normalization was really producing that many types, why did this work before instead of, you know, causing an OOM?

Because the type simplifies massively upon construction - `undefined` and `null`, intersected with anything, are just `never`, so for every combination we made, we'd just get `never`, then discard that element and move on (which would chew through compute time but not really memory). Or that's what we _would_ have done had we been making the type prior to 3.5 - prior to 3.5 we were just not making the constraint correctly and unsoundly checked against a union of unions, so it never came up.
