I have noticed in several situations that after type-refining a value, using that value inside a closure causes Flow to lose track of the refinement. For example, this comes up often with Redux-style action refinement:
type Action = {
  type: "FOO",
  payload: string
} | {
  type: "BAR",
};

const reducer1 = (action: Action) => {
  switch (action.type) {
    case "FOO":
      return action.payload.length;
  }
};

const reducer2 = (action: Action) => {
  switch (action.type) {
    case "FOO":
      return ["a", "b"].filter(item => item === action.payload);
  }
};
reducer2 will cause an error when accessing action.payload inside the filter callback, even though the switch has already refined the type.
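For what it's worth, the refinement is sound at runtime. Here is the same reducer as a plain JavaScript sketch with the Flow annotations stripped, showing the callback does see the refined payload:

```javascript
// Same shape as reducer2 above, Flow annotations removed.
// At runtime the switch guarantees payload exists in the "FOO" branch.
const reducer2 = (action) => {
  switch (action.type) {
    case "FOO":
      return ["a", "b"].filter(item => item === action.payload);
  }
};

console.log(reducer2({ type: "FOO", payload: "a" })); // ["a"]
console.log(reducer2({ type: "BAR" }));               // undefined
```

So the error Flow reports here is a false positive from its perspective of the callback, not a real runtime hazard in this particular case.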
An even simpler example:
const fn = () => {
  let foo: ?string;
  foo = "hello";
  if (!foo) return; // type refinement
  console.log(foo.length); // no error
  Array(3).map(x => foo.length); // flow error
};
Is there a purposeful reason I am missing that the refinement is invalidated inside of the function? Or is the type incorrectly getting lost?
I think this is a duplicate of https://github.com/facebook/flow/issues/5721
I'm not sure it's a duplicate. I've just encountered something similar and tried to distill it down to the smallest example (Try Flow):
const foo: Array<?string> = [ 'a', 'b', null, 'd' ]
const bar: Array<string> = foo.filter((x) => x != null)
4: const bar: Array<string> = foo.filter((x) => x != null)
   ^ array type. Has some incompatible type argument with
4: const bar: Array<string> = foo.filter((x) => x != null)
   ^ array type
Type argument `T` is incompatible:
3: const foo: Array<?string> = [ 'a', 'b', null, 'd' ]
   ^ null or undefined. This type is incompatible with
4: const bar: Array<string> = foo.filter((x) => x != null)
   ^ string
After looking at this I don't think it's completely related, but it does seem that Flow is not understanding the type refinement coming out of the closure.
@scull7 Flow can't refine the type of an array through a filter callback; it's not about the closure. A workaround is flatMap (from a library; native support is coming soon).
declare function flatMap<A, B>($ReadOnlyArray<A>, A => $ReadOnlyArray<B>): $ReadOnlyArray<B>
const foo: $ReadOnlyArray<?string> = [ 'a', 'b', null, 'd' ]
const bar: $ReadOnlyArray<string> = flatMap(foo, (x) => typeof x === 'string' ? [x] : [])
Now the type can be refined correctly.
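A runnable sketch of that approach, with the Flow types stripped and a minimal userland flatMap (an assumption on my part; any spec-compliant implementation works the same way):

```javascript
// Minimal userland flatMap: map each element to an array, then concatenate.
function flatMap(arr, fn) {
  return arr.reduce((acc, x) => acc.concat(fn(x)), []);
}

const foo = ['a', 'b', null, 'd'];
// Mapping non-strings to [] drops them, so Flow can type the result
// Array<string> without needing to refine through a filter predicate.
const bar = flatMap(foo, (x) => (typeof x === 'string' ? [x] : []));
console.log(bar); // ['a', 'b', 'd']
```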
@scull7 For your particular case, you can use:
const foo: Array<?string> = [ 'a', 'b', null, 'd' ]
const bar: Array<string> = foo.filter(Boolean)
And flow will not complain.
@apsavin Even with sketchy-null check?
@TrySound heh, I'm not 100% sure, but I think yes. See https://github.com/facebook/flow/blob/master/lib/core.js#L203
Flow doesn't support this at the moment. The reason is that JavaScript's filter function returns a subset of the elements of type T from the source array of type Array<T>.
In your scenario, the predicate you pass to filter ensures that only values of a narrower type are returned, but there's no way for Flow to "flow" that type information back out to the calling method.
What you would need is something like this: https://www.typescriptlang.org/docs/handbook/advanced-types.html#user-defined-type-guards, and a way for the type definition of Array#filter to be able to understand that type guard via generics. This is something TypeScript currently supports. The code would look like this:
const foo: Array<string | null | undefined> = [ 'a', 'b', null, 'd' ]
const bar: Array<string> = foo.filter((value: string | null | undefined): value is string => value != null)
Flow doesn't support custom type guards at all: https://github.com/facebook/flow/issues/34
@apsavin That's not going to do what the OP wanted. If you have empty string in the original array then it won't be included in bar.
@apsavin Heh, I think it's a bug.)
@TrySound I don't think that being able to filter by Boolean is a bug... You can pass anything to Boolean and it will return true/false, which is what filter needs. It's just not behaviourally the same as what @jjshammas wanted.
@massimonewsuk If sketchy-null is enabled, a statement like if (maybeString) {} will be flagged. With Boolean it won't. So empty strings will be filtered out even though they are valid values, and Flow won't warn you.
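To make the difference concrete, a plain JavaScript comparison (Flow annotations omitted): filter(Boolean) drops every falsy value, including valid empty strings, while an explicit null check keeps them:

```javascript
const values = ['a', '', null, 'd', undefined];

// Boolean coerces each element to true/false; '' is falsy, so it is dropped.
const viaBoolean = values.filter(Boolean);            // ['a', 'd']

// The explicit check rejects only null and undefined; '' survives.
const viaNullCheck = values.filter((x) => x != null); // ['a', '', 'd']

console.log(viaBoolean, viaNullCheck);
```

This is the behavioral difference being discussed: both satisfy Flow, but they don't compute the same result.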
Oh I get what you mean... sorry I had no idea what sketchy-null was.
Perhaps my usage of filter sidetracked the convo a bit, but I think the second short example highlights the closure issue well:
const fn = () => {
  let foo: ?string;
  foo = "hello";
  if (!foo) return; // type refinement
  console.log(foo.length); // no error
  Array(3).map(x => foo.length); // flow error
};
I realized after thinking about this some more that Flow does not know whether the closure is invoked immediately, so invalidating the refinement could be correct behavior. For example, here we don't call the closure immediately, which results in a runtime exception:
const fn2 = (closure: Function) => {
  setTimeout(closure, 100);
};

const fn = () => {
  let foo: { bar?: string } = { bar: "qux" };
  if (!foo.bar) return; // type refinement
  console.log(foo.bar.length); // no error
  fn2(x => foo.bar.length); // at closure creation, we know foo.bar exists
  delete foo.bar; // hope you invoked the closure already!
};
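To make that failure observable deterministically, here is the same idea with Flow types stripped and setTimeout replaced by an explicit queue (a hypothetical sketch):

```javascript
// fn2 defers the closure instead of calling it; a queue stands in for
// setTimeout so the failure can be triggered synchronously.
const queue = [];
const fn2 = (closure) => { queue.push(closure); };

const fn = () => {
  let foo = { bar: "qux" };
  if (!foo.bar) return;      // refinement holds here...
  fn2(() => foo.bar.length); // ...but the closure captures foo itself
  delete foo.bar;            // refinement no longer holds when it runs
};

fn();
let threw = false;
try {
  queue[0](); // TypeError: cannot read 'length' of undefined
} catch (e) {
  threw = true;
}
console.log(threw); // true
```

This is exactly the hazard Flow's invalidation guards against: the closure reads `foo.bar` at call time, not at creation time.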
That being said, is there a way Flow can improve its knowledge of which closures will and will not be invoked synchronously? This may be a pipe dream for user-defined functions like the one above, since without that obvious setTimeout it may be difficult to infer when the final closure invocation really happens. But at least for native functions like Array.map, Array.filter, etc., you know and I know that the closure will get invoked synchronously.
Plus, this issue should only be relevant when a value is passed by reference into a closure. My short example using foo as type string should not cause an issue, since there's no way I can imagine that the closure would get invoked and suddenly not have a string.
Another simple example here (flow.org/try)
/* @flow */
function test(obj: ?Object) {
  if (obj) {
    // error - thinks obj may be null or undefined
    Object.keys(obj).map(key => obj[key]);
    // passes
    Object.keys(obj).map(key => obj && obj[key]);
  }
}
Error:
6: Object.keys(obj).map(key => obj[key]);
^ Cannot get `obj[key]` because an indexer property
is missing in null or undefined [1].
References:
3: function test(obj: ?Object) {
^ [1]
Is the way to work around this now just to use a // $FlowFixMe on the line above the error line? Seems unnecessary and wrong to add an additional obj && before reading obj[key].
@pbeshai your example highlights where this is likely completely incorrect behavior. Even if, hypothetically, the callback you passed to .map was not invoked until seconds later, there is no way for the reference to obj to get redefined such that it would be null and cause an exception.
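A common workaround for cases like @pbeshai's is to copy the refined value into a const before the closure, so the binding the closure captures can never change (a hypothetical sketch; the Flow annotation is shown as a comment so this runs as plain JavaScript):

```javascript
function test(obj /*: ?Object */) {
  if (obj) {
    // A const alias preserves the refinement: Flow knows `o` cannot be
    // reassigned between closure creation and invocation.
    const o = obj;
    return Object.keys(o).map(key => o[key]);
  }
  return [];
}

console.log(test({ a: 1, b: 2 })); // [1, 2]
console.log(test(null));           // []
```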
Ran into the same issue as @pbeshai
Extremely contrived example:
function test(a?: Array<string>) {
  if (a === undefined) {
    return [];
  }
  return a.map((str, index) => {
    return a[index];
  });
}
9: return a[index];
^ Cannot get `a[index]` because an indexer property is missing in undefined [1].
References:
3: function test(a?: Array<string>) {
^ [1]
There's no way for a to be undefined at this point.
Any updates on this? Running into similar cases and have resorted to $FlowFixMe or redundant checks
Works when assigning to a local variable before the closure:
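Presumably that refers to a pattern like the following (a hypothetical sketch of the pattern; the Flow annotation is shown as a comment so it runs as plain JavaScript). The closure captures an immutable const binding, so the refinement survives:

```javascript
const fn = () => {
  let foo /*: ?string */ = "hello";
  if (!foo) return [];
  // Capture the refined value in a const before entering the closure;
  // Flow keeps the refinement because `refined` can never be reassigned.
  const refined = foo;
  return Array.from({ length: 3 }, () => refined.length);
};

console.log(fn()); // [5, 5, 5]
```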