TypeScript Version: 2.1.4
Code
Currently, with --strictNullChecks on, the TypeScript compiler seems to treat
interface Foo1 {
bar?: string;
}
and
interface Foo2 {
bar?: string | undefined;
}
as being the same type, in that let foo: Foo1 = {bar: undefined}; is legal. However, this does not seem to be the original intent of the language. While some people have used ? as a shortcut for | undefined (e.g. https://github.com/Microsoft/TypeScript/issues/12793#issuecomment-266095767), it seems that the intent of the operator was to signify that the property either does not appear, or appears and is of the specified type (as alluded to in https://github.com/Microsoft/TypeScript/issues/12793#issuecomment-266092763), which might not include undefined. In particular, it would let us write types whose property, if present (and enumerable), is not undefined, or is not present on the object at all.
In other places it might make sense for ? to keep serving as a shorthand for | undefined, e.g. for function parameters. (Or maybe for consistency it would make sense to stick with it meaning "you may always pass undefined if you pass anything"? Not sure what's best there. Happy to discuss.)
Expected behavior:
interface Foo1 {
bar?: string;
}
function baz(bar: string) {};
let foo: Foo1 = {bar: undefined}; // type error: string? is incompatible with undefined
if (foo.hasOwnProperty("bar")) {
baz(foo.bar); // control flow analysis should figure out foo must have a bar attribute here that's not undefined
}
if ("bar" in foo) {
baz(foo.bar); // control flow analysis should figure out foo must have a bar attribute here that's not undefined
}
Actual behavior:
interface Foo1 {
bar?: string;
}
function baz(bar: string) {};
let foo: Foo1 = {bar: undefined}; // compiler is happy
if (foo.hasOwnProperty("bar")) {
baz(foo.bar); /* error TS2345: Argument of type 'string | undefined' is not assignable to parameter of type 'string'. Type 'undefined' is not assignable to type 'string'*/
}
if ("bar" in foo) {
baz(foo.bar); /* error TS2345: Argument of type 'string | undefined' is not assignable to parameter of type 'string'. Type 'undefined' is not assignable to type 'string'*/
}
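As a stopgap, here is a hedged sketch of a user-defined type guard that encodes the "present and not undefined" check the issue wants the compiler to infer; hasDefined is a made-up helper, and it tests the value rather than key presence:
// Narrows obj so that obj[key] is present and excludes undefined.
function hasDefined<T, K extends keyof T>(
  obj: T,
  key: K
): obj is T & { [P in K]-?: Exclude<T[P], undefined> } {
  return obj[key] !== undefined;
}

interface Foo1 { bar?: string }
function baz(bar: string) {}

const foo: Foo1 = {};
if (hasDefined(foo, "bar")) {
  baz(foo.bar); // OK: narrowed to string
}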
I agree, in JS there is a difference between the two
const a = { prop: undefined };
a.prop // undefined
a.hasOwnProperty('prop') // true
'prop' in a // true
for (k in a) {
console.log(k) // logs 'prop'
}
const b = { };
b.prop // undefined
b.hasOwnProperty('prop') // false
'prop' in b // false
for (k in b) {
console.log(k) // never hit
}
I wrote that comment you referenced. I go back and forth on this. The devil's advocate position is that, under the proposed stricter interpretation of ?, I'll still be able to assign {} to Foo, and thus I _still_ must treat foo.bar as string | undefined at all places in my code. The only thing I can see being achieved with this higher level of strictness is a new way of type guarding a value for which a perfectly good type guard is already available.
Instead of looking at keys, you can accomplish the type inference you want right now by looking at values.
interface Foo {
bar?: string;
}
function baz(bar: string) {};
let foo1: Foo = {bar: undefined}; // would become type error: string? is incompatible with undefined
let foo2: Foo = {}; // Would still, however, be allowed
if (typeof foo1.bar !== 'undefined') {
baz(foo1.bar); // Control flow already correctly narrows type to string
baz(foo2.bar); // Control flow correctly warns that this could be undefined (under strictNullChecks)
}
So, if there is a specific key in mind, there is already a clean way of type guarding it.
Is the difference more pertinent when enumerating a whole object by keys? I couldn't construct a case in which it is.
for (let k in foo1) {
let val = foo1[k]; // Would still be any under strictOptionalMember, and still an error under noImplicitAny
}
Possibly some meaningfully different expression could be constructed using lookup types. But I almost think it's a code smell to care about whether a key exists on an object.
The bird's eye view is that JavaScript _intends_ for the developer to equivocate between "no key" and "key: undefined". Thinking about the existence of a key is generally not a very useful paradigm in JavaScript. Instead, what really matters are the possible types of the values. So while this new way might be more faithful to the mechanics of JavaScript, it's not more faithful to the intended usage of JavaScript.
I have met JavaScript libraries that used 'prop' in options or options.hasOwnProperty('prop') to check for the presence of an option, and then when you pass undefined (because you just pass your own optional parameter, for example) it results in bugs TypeScript didn't catch.
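To make that failure mode concrete, here is a hedged sketch of the kind of bug described above; connect, openSession and their options are made up for illustration:
interface ConnectOptions { timeout?: number }

// Hypothetical library code: checks presence of the key, not its value.
function connect(options: ConnectOptions) {
  const timeout = 'timeout' in options ? options.timeout : 30000;
  // If the caller passed { timeout: undefined }, timeout is now undefined
  // even though the library meant its default to kick in.
  return timeout;
}

// Caller forwards its own optional parameter:
function openSession(timeout?: number) {
  return connect({ timeout }); // type-checks, but may pass an explicit undefined
}

openSession(); // connect receives { timeout: undefined } – a bug TypeScript didn't catch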
That's a strong argument. It's good for TypeScript's type system to be able to express whatever is possible in JS, whether or not those possibilities are a good idea.
yeah... if I was designing JavaScript from scratch, I'd probably leave out the whole concept of undefined. But it exists in JS, and I don't think TypeScript is willing to try to change that, so we have to live with it. Making it possible to distinguish, in the type system, between an attribute existing or not, without making that attribute optionally undefined, would be useful, since we are working with a language that has these (perhaps unnecessary) complexities.
The current missing/undefined convolution does seem potentially problematic...
const x: {a?: string} = {a: undefined}; // expect error, but passed
Is it possible to introduce void to mean "missing" and leave undefined as undefined? So { foo?: string } would be equivalent to { foo: string | void }?
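For reference, a quick sketch of how | void behaves on a property today (current behavior, not the proposed semantics): the key must still be present, and an explicit undefined is still accepted.
type WithVoid = { foo: string | void };

const a: WithVoid = { foo: "hi" };       // OK
const b: WithVoid = { foo: undefined };  // OK – undefined is assignable to void
// const c: WithVoid = {};               // error: property 'foo' is missing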
Where does this issue currently stand? Is it something under consideration, or are there reasons why we can't fix this?
Another example where TypeScript's inability to distinguish missing from undefined leads to inconsistencies between the types and JavaScript's actual behaviour is usage of object spread. A real world example of this pattern is a withDefaults higher-order React component.
type Props = {
foo: string;
bar: string
}
type InputProps = {
foo?: string;
bar: string;
}
const defaultProps: Pick<Props, 'foo'> = { foo: 'foo' };
const inputProps: InputProps = { foo: undefined, bar: 'bar' };
// Type of `foo` property is `string` but actual value is `undefined`
const completeProps: Props = { ...defaultProps, ...inputProps };
$ node
> const defaultProps = { foo: 'foo' };
undefined
> const inputProps = { foo: undefined, bar: 'bar' };
undefined
> const completeProps = { ...defaultProps, ...inputProps };
undefined
> completeProps
{ foo: undefined, bar: 'bar' }
Ideally, this would throw a type error:
// `{ foo: undefined; bar: string; }` is not assignable to `{ foo?: string; bar: string; }`
const inputProps: InputProps = { foo: undefined, bar: 'bar' };
On the other hand, if TypeScript did purposely not distinguish between missing and undefined, TypeScript should instead emit an error when spreading:
// `{ foo: string | undefined; bar: string }` is not assignable to `Props`
const completeProps: Props = { ...defaultProps, ...inputProps };
Another real world example where this matters: React elements have their className prop marked as optional. However, this also allows undefined to be passed as a value, resulting in HTML of class="undefined":
<div className={undefined} />
// creates <div class="undefined" />
I believe this is related to an issue I'm having exporting a hoc withRouter from a component library to another project (using TS 2.8.3). This hoc has own props like (notice that prop3 is optional):
type OwnProps = {
prop1: string;
prop2: string;
prop3?: string;
};
which causes the component to be default exported in the compiled code as:
React.ComponentClass<Pick<OwnProps, "prop1" | "prop2" | "prop3" | undefined>>;
Raising the error: Type 'undefined' is not assignable to type '"prop1" | "prop2" | "prop3"'.
I had to work around it by making the optional props mandatory, explicitly accepting undefined in their union type:
type OwnProps = {
prop1: string;
prop2: string;
prop3: string | undefined;
};
Does this make sense? Is this still up?
It would be useful now to be able to define whether a property should be on an object or not based on some conditional type.
type Foo<CK extends string, NK extends string> = {
...
extraMappings: Exclude<NK, CK> extends never ? missing : {something: string}
}
You can use undefined currently, but as discussed it's not quite the same.
Would this also address this use case?
interface Foo<T = undefined> {
bar(a: T extends undefined ? missing : T): void;
}
Currently there is no way I can find to make bar conditionally require an argument.
@robbiespeed well following ES6 semantics that should really be made possible with just
interface Foo<T = undefined> {
bar(a: T): void
}
because according to ES6, passing undefined to a parameter is equivalent to passing no parameter - if you declared a default value, for example, passing undefined will result in the default value being used, which would not be true for e.g. passing null.
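A quick check of those ES6 default-parameter semantics (a sketch; greet is a made-up function):
function greet(name: string | null = "world") {
  return `hello ${name}`;
}
greet();          // "hello world"
greet(undefined); // "hello world" – undefined triggers the default
greet(null);      // "hello null"  – null does not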
@felixfbecker this is not how it currently works; TypeScript will complain, stating Expected 1 arguments, but got 0. Also, foo() and foo(undefined) are not exactly equivalent. Take the following example:
function foo () { return arguments.length; }
foo() // returns 0
foo(undefined) // returns 1
For arrow functions the semantics would be the same though, since arguments is not accessible. I'm on the fence about whether TypeScript should function according to the exact semantics or not. It's a bit of an edge case, and as far as I know, use of arguments is not something TypeScript makes an effort to account for anyway.
not sure how we arrived at talking about arguments, but an arguments-less example:
function foo(...args: any[]){ return args.length; }
foo(); // returns 0
foo(undefined); // returns 1
IMO that's less ugly than using arguments directly, but it'll transpile to arguments if your target is es5.
But anyhow, this topic was originally about missing keys vs. keys present with an undefined value within an object. That feels like a different case than missing vs. present arguments to functions - though if we can figure out a way to unify the two concepts in the type system, I'm all for it.
I would like to see this ability added. I commonly use an object as a dictionary, mapping strings to values.
interface Dictionary<T> {
[key: string]: T;
}
In this case you are not sure which keys are in the object, but you know that if they are in the object they have a value of type T (i.e., not undefined). So what I want is this:
const d: Dictionary<number> = getDictionary();
const name: string = getName();
++d[name]; // should be an error; name may not be in d
// (currently accepted by TypeScript)
if (name in d) {
++d[name]; // should be OK; if name is in d, its value will be a number
}
for (let k in d) {
d[k]++; // should be OK; every key in d has a number value
}
Currently TypeScript does not flag the first increment as an error. I can understand that: it is necessary for TypeScript to assume you are using a valid key; otherwise using arrays would be tiresome. However, it would be nice if I could use the following type to mean that some keys may be missing, not to mean that some keys may have an undefined value.
interface Dictionary<T> {
[key: string]: T | undefined;
}
This currently does flag the first increment as an error, as desired. However, it also flags the second and third increments as errors since it assumes their values may be undefined. If TypeScript distinguished between missing keys and keys with an undefined value then I believe this second type would do what I want.
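In the meantime, a hedged sketch of the value-based workaround that works today with that second Dictionary shape (assuming T itself never legitimately includes undefined):
interface Dictionary<T> {
  [key: string]: T | undefined;
}

const d: Dictionary<number> = { a: 1 };
const key: string = "a";

const current = d[key];       // number | undefined
if (current !== undefined) {
  d[key] = current + 1;       // OK: narrowed to number
}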
Hmm, in #24897 (tuples in rest/spread positions):
Optional elements in tuple types
Tuple types now permit a ? postfix on element types to indicate that the element is optional: let t: [number, string?, boolean?];
That makes me wonder if we might use T? (where T is a type) everywhere to mean "an optional parameter/property whose type, if not missing, is T". So we allow the ? to jump from the parameter/key name to the type. That is, the following function signatures are nearly equivalent:
declare function foo(a?: string): void;
declare function foo(a: string?): void;
(modulo whether you want one or both of them to disallow foo(undefined)) and the following types are nearly equivalent:
type Foo = {a?: string};
type Foo = {a: string?};
(modulo whether you want one or both of them to disallow {a: undefined}). And the missing type would be equivalent to never?.
I find myself wanting this feature more now, since #26063 (mapped arrays/tuples), as a way to programmatically add/remove optionality from tuples.
Unfortunately the optional tuples suffer from the same problem:
const a: [number, string?, boolean?] = [1];
const b: [number, string?, boolean?] = [1, undefined];
const c: [number, string?, boolean?] = [1, "string", undefined];
const d: [number, string?, boolean?] = [1, undefined, undefined];
I do think there is mileage in the syntax T?, though I think it should just be a type with a meaning that is invariant of context. T? should say nothing about properties or arguments and should be valid anywhere a type is. I think that T? is allowed in JSDoc types, though I'm not sure how this plays with TS.
So my thinking would be something like: in strictMissingMode,
{ x?: string } means either the property is not present, or the property is explicitly present and the value is of type string. The ? is an attribute of the property, not the type string. Here string means exactly what it says.
{ x: string? } means that the property is explicitly present but the value is either of type string or undefined. The ? is an attribute of the type, so string? means the same thing in all contexts. Essentially string? === (string | undefined).
The same would apply to function parameters, with the following behaviour.
type X = { x?: string };
const a: X = {}; // valid
const b: X = { x: "x" }; // valid
const c: X = { x: undefined }; // invalid
type Y = { y: string? };
const d: Y = {}; // invalid
const e: Y = { y: "y" }; // valid
const f: Y = { y: undefined }; // valid
function foo(x?: number): {}
foo() // valid
foo(4); // valid
foo(undefined) // invalid
function bar(x: number?): {}
bar(); // invalid
bar(4) // valid
bar(undefined) // valid
So the existing type { x?: string } in vanilla TS would be equivalent to { x?: string? } under the strict mode, which it sort of gets expanded out to anyway. In the strict mode, { x?: T } and { x: T? } coincide only when the value has the property x of type T.
Additionally I think the strict mode should do away with the unsound optional property weakening rule that says: { ... } is assignable to { ...; y?: string } if the former doesn't have a conflicting y property. Instead the rules should be:
{ x: T } is assignable to { x?: T }.
{ ... } is assignable to { ...; y?: unknown }.
The reverse can be applied through in narrowing, such that:
if a : { x?: T }, then "x" in a narrows to { x: T }.
if a : object, then "x" in a narrows to { x?: unknown }, or some suitable intersection type.
The strict mode could also have some interesting interactions with never. Right now a type { x: never } is sort-of meant to mean that x does not exist, though the type should really be isomorphic to never. The type { x?: never } says either the property does not exist, or it exists with type never, so only the former can be true. The type { [K in string | number | symbol]?: never } might represent the true empty object.
For anyone still waiting for optional function arguments, it is now possible to simulate that using new tuple types and spread expressions.
type OptionalSpread<T = undefined> =
T extends undefined
? []
: [T]
const foo = <T = undefined>(...args: OptionalSpread<T>): void => {
const arg = args[0] // Type of: T = undefined
}
// undefined inferred
foo() // OK
foo(42) // OK <----single argument type is inferred, can't do anything about it
// undefined explicit
foo<undefined>() // OK
foo<undefined>(42) // ERROR Expected 0 arguments, but got 1.
// number
foo<number>(42) // OK
foo<number>() // ERROR Expected 1 arguments, but got 0.
foo<number>("bar") // ERROR Argument is not assignable to parameter of type 'number'.
It has a limitation with inferred argument types though, which is solved by explicitly specifying the undefined type argument.
TypeScript version 3.3, strictNullChecks enabled:
type UncertainShape = {
uncertainProp?: number;
certainProp: number;
}
declare var a: UncertainShape;
var b: Record<string, number> = a;
// here is error, since number | undefined cannot be assigned to number
// but should be okay since both variants of having and not having uncertainProp
// presented in the shape should still match the shape of Record<string, number>
@shmidt-i My understanding is that if you declare a variable of type Record<string, number>, then whatever property you take from it should return a number. Your type UncertainShape explicitly goes against that assumption, which is why it throws an error.
Here you can see a similar example. What is, in my opinion, misleading is that the second line is OK. I personally would prefer to get an error in both situations.
This keeps coming up from time to time. We'd like to see a PR that implements this so we can roughly estimate its complexity and see what the breakage on Real World Code is. It shouldn't be a super-difficult PR to sketch out the basics for.
Basic possible outcomes:
--strict with an opt-out
We have a project with over 3,500 files and have never run into this problem; however you put it, TypeScript has enough expressive power to please you. Not sure why this draws so much attention.
Started here: #30796
I'm considering the idea of using a TSDoc tag @cannotBeUndefined plus a custom ESLint rule to handle this problem (https://github.com/microsoft/tsdoc/issues/163). It came up when adapting legacy JavaScript files to work with TypeScript.
My main issue is with the unsoundness of the spread syntax:
const x: {foo?: number} = {foo: undefined}
const y: {foo: number} = {foo: 123, ...x}
console.log(y) // y is now {foo: undefined}
@michalburger1 as a workaround, I've been using JSON.stringify to get rid of the undefined values before merging:
interface Options { a?: number }
const defaultOptions: Options = { a: 123 };
const userOptions: Options = { a: undefined };
const mergedOptions: Options = {
  ...defaultOptions,
  ...JSON.parse(JSON.stringify(userOptions))
};
console.log(mergedOptions); // --> { a: 123 }
It may not be a solution to cover the 100% of the scenarios but so far covers most of them (at least for me)
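An alternative sketch that avoids the JSON round-trip (so functions, Dates, etc. survive): strip the undefined-valued keys before spreading. stripUndefined is a made-up helper name, the final cast is where the unchecked assumption lives, and Object.entries assumes an ES2017+ lib.
function stripUndefined<T extends object>(obj: T): Partial<T> {
  const result: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(obj)) {
    if (value !== undefined) {
      result[key] = value;
    }
  }
  // Asserts that dropping undefined-valued keys still matches Partial<T>.
  return result as unknown as Partial<T>;
}

interface Options { a?: number }
const defaults: Options = { a: 123 };
const user: Options = { a: undefined };
const merged: Options = { ...defaults, ...stripUndefined(user) };
console.log(merged); // { a: 123 }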
If I'm not mistaken, the distinction matters most for mapped types, e.g. for a patchObject() function. But having ? cover undefined, as is currently the case, is also useful, e.g. for monomorphic object factories.
interface Foo<T> {
bar: T,
baz?: T
}
function makeFoo<T>(bar: T, baz?: T): Foo<T> {
return {bar, baz}
}
The former pattern is probably more common than code that requires making the distinction between missing and undefined. It is important because it lets JS engines optimize objects with a hidden class, whereas adding or deleting keys triggers slow code paths. So the current semantics give nice ergonomics to a pattern that's useful in practice.
So it would be nice to let users have it both ways. The | void suggestion above would be a possibility, but the presence of a key is part of the type of the object, and that syntax makes it look like it's part of the type of the value. Hence my proposal in #26438 to introduce a ?! suffix that would introduce the semantics proposed here, and keep ? as it is now. This would also be backwards compatible without the need for a "codemod" script to update every optional property.
FYI, a small extra example of where it is needed. It is a necessity when you are working with Firestore: Firestore doesn't allow you to store undefined values and responds with an error. Currently TS doesn't catch an explicit undefined value you accidentally pass to Firestore; you can do that directly or via spread syntax on an object. For this particular case I have to handle that via autotests (I can mock the Firestore client and track that we never pass an explicit undefined value in tests). But that is not ideal (it is difficult to achieve full test coverage) and I would prefer TS to track that.
@Artemnsk regardless of TS, here's an idea which may help you: wrap each Firestore function you use in your own, where you remove all properties with a value of undefined. This will work at runtime and won't require tests.
@elektronik2k5 I would not recommend doing this particular thing on the backend. Sometimes a thrown error is better than saving [potentially] invalid data. A value can be set to undefined accidentally (e.g. instead of null/0/""/false, or for hundreds of other reasons).
That's why we love TS - once you've covered your code with good types you don't need to take care of such things (write an extra programmatic layer for validation and/or autotests).
Probably(!) Firestore library contributors will be among the first to apply this TS feature in their codebase. And I'm quite sure they intentionally do not suppress undefined values on their library side before saving.
Note that void meaning "missing" has already been implemented for function parameters (with no trailing required parameters) in #27522:
function foo(x: string, y: number | void, z: boolean | void) { }
foo("a", 1, true); foo("a", 1); foo("a"); // okay
So that's closer to this being a real thing. Of course there are still some outstanding bugs here (#29131); and it doesn't yet work on object properties; and we can still call foo("a", undefined), so the part where this distinguishes "missing" from undefined isn't there. But it's a start, right?
@jcalz I know 100% soundness isn't a goal but this made me chuckle
function foo(x: string, y: number | void, z: boolean | void) { }
const f: () => void = () => "i am a string";
foo("x", f()); //Compiles
Because void doesn't mean "missing" or undefined. It means "a value exists but you should not use it", or something to that effect.
@AnyhowStep that's a really good point, it has two conflicting meanings right now.
In f: we don't care about the value, it could be anything.
In foo: the value is missing and evaluates as false.
function doThingWithNum(n: number) {}
function foo(x: string, y: number | void, z: boolean | void) {
if (y) {
doThingWithNum(y); // works, but should error because we haven't actually proven it's a number
}
}
foo("a", 1, true); foo("a", 1); foo("a"); // okay
const f: () => void = () => "i am a string";
foo("x", f());
@jcalz this seems like a really bad bug, especially if people are using void to mean missing. What's worse, there doesn't seem to be a way to fix this without breaking backwards compatibility.
I don't think this is a problem that is ever going to be solved by adding a special type - it really needs to be pushed up a level to be a property of fields and arguments.
void is broken for a variety of reasons; I hope TypeScript 4.0 can readdress this now that void is largely redundant with unknown.
@jack-williams The issue with having it be a property of fields and arguments is cases where we need to allow missing generic arguments, which, if allowed, would essentially create a special missing type in the context of that generic. So why not make it a special type to begin with?
I'm not sure I really agree that generics are an issue. IMO it just seems weird to ask a value if it is missing by consulting its type; by definition you have a value so it can't be missing! The only thing that knows whether there should be a value there or not is the container, be it an object or function.
Once you have a value, of possibly generic type, the concept of missing has already been lost. The semantics of the container determine how to interpret missing.
What about a --noImplicitUndefined compiler flag that would make optional parameters and properties not automatically allow undefined?
@jack-williams I should have clarified that I was speaking of generic interfaces and classes as in this example:
interface GenericTransaction <Input, Result> {
// take possible input and return result
run(input: Input): Result
}
const foo: () => void = () => 'Hello'
const t: GenericTransaction<undefined, undefined> = {
run() {
return undefined;
}
}
t.run(); // expects 1 argument even though we want that to be optional
t.run(foo()) // void is not assignable to undefined, which protects us from unsafe void
const t2: GenericTransaction<void, undefined> = {
run() {
return undefined;
}
}
t2.run(); // this works
t2.run(foo()); // but has the issue of accepting unsafe void
Transaction in this case could also be an abstract class that has a lot of internal logic for running transactions, so creating a separate interface or class for the empty input case wouldn't be ergonomic.
There are 3 options:
undefined == missing makes sense, as the value will always be undefined if the argument is missing.
What about a --noImplicitUndefined compiler flag that would make optional parameters and properties not automatically allow undefined?
That's a sane scenario. It will become contagious quickly though, because if one of your dependencies relies on it, you'll have to add it to your project as well, because the meaning of
interface Foo {
a?: number
}
will differ across code bases, and types that would otherwise have been structurally compatible won't be anymore. Also, error reporting would have to communicate whether Foo has --implicitUndefined as part of its type, or it will cause head scratching.
Found this issue when trying to understand TypeScript's behavior when merging two objects using the spread operator, one of which has optional fields: https://stackoverflow.com/questions/60626844/spread-operator-and-optional-fields-how-to-infer-proper-type
Please verify whether my theory is correct (see my answer on SO). I came to the conclusion that I should stop using optional fields until all the related bugs are fixed.
If I really need optional fields, I should use explicit | undefined with explicit casting to and from partials (e.g. createFromPartial and omitUndefined). Still working on the signatures of these typecasting functions. Help is appreciated.
createFromPartial should take Partial<T> and return an object whose fields are converted to { [K in keyof T]: T[K] | undefined }. That is, createFromPartial<{ value: number }>({ }) should create an object with type { value: number | undefined }.
omitUndefined should take { value: number | undefined } and convert it to { value: number }.
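A possible sketch of those two signatures (hedged: createFromPartial and omitUndefined are the commenter's proposed helper names, not existing library functions; the casts mark where the runtime guarantees end):
type Explicit<T> = { [K in keyof T]-?: T[K] | undefined };

// Widens the type only; keys that were missing stay missing at runtime.
function createFromPartial<T>(partial: Partial<T>): Explicit<T> {
  return partial as unknown as Explicit<T>;
}

// Drops undefined-valued keys and narrows the type accordingly; unsafe if a
// key that T requires really was undefined at runtime.
function omitUndefined<T>(obj: Explicit<T>): T {
  const result: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(obj)) {
    if (value !== undefined) result[key] = value;
  }
  return result as unknown as T;
}

const widened = createFromPartial<{ value: number }>({});
// widened: { value: number | undefined }
const narrowed = omitUndefined<{ value: number }>({ value: 1 });
// narrowed: { value: number }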
Still, not an ideal solution, as { ...{ }, ...omitUndefined<MyType>({ value: undefined }) } would be treated differently from what happens at runtime. That is, it would be inferred as { value: number } while at runtime it would be an empty object. But it should be better suited to catching bugs than the current behavior, where merging { value?: number } with { value: number } gives you { value: number } no matter the order.
if somebody wants to verify current behavior.
So, the fix should include:
Use never | T instead of never | undefined.
Disallow assigning undefined to optional fields. That is, undefined should not be assignable to never | T if T doesn't include undefined.
delete should only be allowed for fields whose type includes never and some other type. It should not be allowed to delete a field of type undefined.
field: never | number should become field: number after a delete a.field operation. But if you're passing the reference to an object, of course its type shouldn't change, because it doesn't change at runtime.
It's debatable what should happen when assigning an optional field to a new variable. Should the result type be never | T or never | T | undefined? We can't really know which it is. If the field is optional and was not present, it would be undefined at runtime. But this usage should be an error. So maybe it's a warning? E.g. "please use a type guard to ensure the property is present in the object". Well, most functions don't support the never type as an argument, so maybe it doesn't matter; you would need a type guard anyway.
Disallow deleting fields that are non-optional.
big 👎 on this one (it should only be checked on calls/declarations/assignments)
delete is off topic
@Mouvedia Let's discuss delete here: https://github.com/microsoft/TypeScript/issues/13783
@Vanuan never | T is T. I get what you are trying to do, but unfortunately never is not meant to model this situation.
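A quick illustration of that point (the names are made up):
type MaybeMissing = never | string;   // collapses to plain string
interface Foo { field: MaybeMissing }
// const f: Foo = {};                 // error: property 'field' is missing –
//                                    // the `never |` part added nothing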
The crux of the problem, unfortunately, is that the type system can't model this today; not that the syntax for it is overly verbose. @masonk hit the nail on the head in https://github.com/microsoft/TypeScript/issues/13195#issuecomment-269620662
I wrote that comment you referenced. I go back and forth on this. The devil's advocate position is that, under the proposed stricter interpretation of ?, I'll still be able to assign {} to Foo, and thus, I _still_ must treat foo.bar as string | undefined at all places in my code.
Part of the problem here is that TypeScript's interfaces are open. For instance (using a missing type to indicate optional key/value pairs, without implying undefined is a legal value):
interface Foo1 {
bar: string | missing;
}
let f: Foo1 = {}; // valid
let g: {} = {bar: undefined}; // valid
f = g // should be an error, but {} is a valid Foo1
If we try to model Foo1 as {bar: string} | {}, we have the problem that {} is short for "an interface with no fields, ... or with arbitrary keys and values".
As for a path forward, I see two options:
a) try to tackle this in combination with some sort of "closed" concept for interfaces, or for single keys on interfaces
b) try to tackle this similarly to the "object literals don't have excess properties" checks, which avoids the need for working this thoroughly into the algebra of the type system
in (a) we would want something like
interface Foo1 {
bar: string | missing;
}
let f: Foo1 = {}; // valid
let g: {} = {bar: undefined}; // valid
f = g // Error: open interface {} isn't a valid Foo1 because it can have a bar
let h: BanKeys<{}, "bar"> = {bar: undefined}; // error: object has banned "bar" key
let i: BanKeys<{}, "bar"> = {}; // ok
i.bar = undefined; // error: i may not have "bar" key
f = i; // ok
An alternate version, using a special missing type for everything instead of the BanKeys<> that I made up:
interface Foo1 {
bar: string | missing;
}
let f: Foo1 = {}; // valid
let g: {} = {bar: undefined}; // valid
f = g // Error: {} isn't a valid Foo1 because it can have a bar, and bar
let h: {bar: missing} = {bar: undefined}; // error: object has banned "bar" key
let i: {bar: missing} = {}; // ok
f = i; // ok
i.bar = undefined; // error: i may not have "bar" key
I'm less convinced by the general usefulness of option (b). There, we would get similar safety as option (a), but only on object literals passed directly to the libraries using missing in their type annotations. In practice I'd expect this would sometimes be useful, but probably not sufficient for most cases.
@Vanuan never | T is T. I get what you are trying to do, but unfortunately never is not meant to model this situation.
Yeah, I wanted this: let o: { field: never } = { };. Apparently, this is invalid code. So there's no way to model the type of an object which is guaranteed not to have a given key, or to have no keys at all. That is, there's no difference between plain objects and duck-like objects, which means TypeScript is still inherently a duck-typed language.
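For what it's worth, a hedged sketch of how close current syntax gets: { field?: never } rejects any defined value for the key, but it still does not rule out an explicit undefined, which is exactly the gap this issue is about.
type NoField = { field?: never };

const a: NoField = {};                    // OK – key absent
// const b: NoField = { field: 1 };       // error: not assignable
const c: NoField = { field: undefined };  // still accepted today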
Closed #38624 in favor of this one.
I'll add a note on this issue.
Here's yet another example that leads to a runtime error:
interface B {
b: number
}
interface A {
a: B;
}
const fn = (arg: Partial<A>): A => ({a:{b:2}, ...arg});
let aRes = fn({a:undefined});
console.log(aRes.a.b);
I expected the program to fail to compile because any prop in Partial<A> can be explicitly set to undefined, which overrides its corresponding value when spread over another object of type A. This causes the return value of fn to not be described by A, but instead by an interface similar to A where all props can be undefined as well.
But got TypeError: "aRes.a is undefined" instead.
Playground Link: https://www.typescriptlang.org/play?ts=4.0.0-dev.20200516#code/JYOwLgpgTgZghgYwgAgELIN4Chm+QIwC5kQBXAW32iwF8stRJZEUBBTHPOY1AblvoIA9iADOYZDBDIAvMgAUcKAHNiABSVhgcADYAeVgD4AlMXYzDCjNwxEATDQA0yAHRulymsf5YdECXAAShCispIg8taEpCAAJhAwoBCxXvzCYkJ+LjpCyorBoi5wLvjeQA
This issue also manifests in Object.values() and Object.entries():
declare const rec: Partial<Record<string, T>>;
Object.values(rec); // type: (T | undefined)[]
It would be nice to be able to safely describe an object that may be defined at a set of keys, but will never hold the value undefined.
Edit: For reference, an actual use case: we have some Redux structures that cache items by id, so we need a record-like structure indexed on an arbitrary id. If we want to list all of the items in this cache, we end up using Object.values().
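A hedged workaround for that cache-by-id shape in the meantime (definedValues is a made-up helper; assumes an ES2017+ lib for Object.values): filter the possible undefined holes back out with a type guard.
type Cache<T> = Partial<Record<string, T>>;

function definedValues<T>(cache: Cache<T>): T[] {
  return Object.values(cache).filter((v): v is T => v !== undefined);
}

const byId: Cache<{ id: string; name: string }> = {
  a: { id: "a", name: "First" },
};
const items = definedValues(byId); // { id: string; name: string }[]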
I've hacked around this by using a tag type :) Maybe this snippet will help somebody. It seems to work for my case...
converting this type
type Foo = {
optional?: 'my optional field';
required: 'this is required field';
};
into this type
type Foo = {
optional: 'my optional field' | undefined;
required: 'this is required field';
};
snippet with example
type MY_TAG_FOR_UNDEFINED_FIELD = { id: 'MY_TAG_FOR_UNDEFINED_FIELD' };
type Tagged <T> = { [K in keyof T]-?: Extract<T[K], undefined> extends never ? T[K] : T[K] | MY_TAG_FOR_UNDEFINED_FIELD };
type Untagged <T> = { [K in keyof T]: Extract<T[K], MY_TAG_FOR_UNDEFINED_FIELD> extends never ? T[K] : Exclude<T[K], MY_TAG_FOR_UNDEFINED_FIELD> | undefined };
type RequiredButUndefined <T> = Untagged<Tagged<T>>;
type Foo = {
optional?: 'my optional field';
required: 'this is required field';
};
type FooRequiredButWithUndefinedUnion = RequiredButUndefined<Foo>;
const fooWithOptionalOK: FooRequiredButWithUndefinedUnion = {
optional: undefined,
required: 'this is required field',
};
const fooWithOptionalOK_2: FooRequiredButWithUndefinedUnion = {
optional: 'my optional field',
required: 'this is required field',
};
const fooWithoutOptional_ERROR_HERE: FooRequiredButWithUndefinedUnion = {
required: 'this is required field',
};
I'm not sure if this is the same idea, but we work a lot with a Result<T, E> type in our TypeScript - it's basically a port of Rust's Result enum type.
Today I wanted a function that in Rust would return a Result<(), E>, i.e. if it fails we get the error, but on success there is no meaningful type for the value. This is the unit primitive.
Now on the surface this doesn't look like it has much to do with missing vs undefined, but let me write an example of what I've had to do to get this to work:
function foo(): Result<null, SomeErrorType> {
try {
operation()
} catch (_) {
return Err(new SomeErrorType())
}
return Ok(null);
}
There is no type I can supply as the T (success) parameter to Result that allows me to do return Ok(). I think if I was able to type the return of that function as Result<missing, SomeErrorType>, then that would allow me to return an Ok without any parameters.
@tom-sherman You can do it if you constrain T to Array<any>, since function arguments are a tuple that extends Array. See the example below. You can probably also do it without the constraint by using conditional types or function overloads. It is a trade-off between cosmetic beauty and simplicity of the implementation.
type Result<T extends Array<any>, E> = {t: T}|{e: E}
function Ok<A extends Array<any>>(...t: A): Result<A, never> {
return {t}
}
const foo: Result<[], Error> = Ok()
const bar: Result<[number], Error> = Ok(123)