I've tried to fetch and process several URLs simultaneously, but I've found there is no way to yield the effects when the tasks array has a variable size. You have to know the exact size of the array and write it out like this:
yield [
call(fetch, '/users'),
call(fetch, '/repos')
]
I found only one way to do it: you fork your fetches, collect the tasks, and then manually join them.
const tasks = [];
for (let i = 0; i < urls.length; ++i) {
  tasks[i] = yield fork(parallelFetch, urls[i], actions);
}
for (let i = 0; i < tasks.length; ++i) {
  yield join(tasks[i]);
}
and parallelFetch is:
function* parallelFetch(url, actions) {
try {
let data = yield call(fetch, url);
data = yield data.json();
yield put(actions.success(data));
} catch (e) {
yield put(actions.error(e));
}
}
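Since call and put also return plain objects, this worker saga can be tested without a store by stepping through the happy path. A sketch with stand-in effect creators and a hypothetical actions object (all assumed shapes, not redux-saga internals):

```javascript
// Stand-ins for redux-saga's effect creators (assumed shapes).
const call = (fn, ...args) => ({ type: 'CALL', fn, args });
const put = (action) => ({ type: 'PUT', action });

// Hypothetical action creators, for illustration only.
const actions = {
  success: (data) => ({ type: 'FETCH_SUCCESS', data }),
  error: (e) => ({ type: 'FETCH_ERROR', e }),
};

const fakeFetch = () => {}; // placeholder for the real fetch

function* parallelFetch(url, actions) {
  try {
    let data = yield call(fakeFetch, url);
    data = yield data.json();
    yield put(actions.success(data));
  } catch (e) {
    yield put(actions.error(e));
  }
}

// Step through the happy path, feeding fake values back in.
const it = parallelFetch('/users', actions);
const callEffect = it.next().value;         // the CALL descriptor
const jsonResult = { id: 7 };
const resp = { json: () => jsonResult };
const yieldedJson = it.next(resp).value;    // whatever json() returned
const putEffect = it.next(yieldedJson).value; // PUT of the success action
```

The same technique covers the error branch: calling it.throw(someError) after the first yield should produce the put(actions.error(...)) effect.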
Are there any other ways to do it? Sorry for my English.
You can do
yield urls.map( url => call(parallelFetch, ...) )
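To see why this works, note that mapping over the URLs just builds an array of plain call descriptors in one step; yielding that single array is what lets the middleware run them in parallel. A minimal sketch, with a stand-in call creator (assumed shape):

```javascript
// Stand-in for redux-saga's call effect creator (assumed shape).
const call = (fn, ...args) => ({ type: 'CALL', fn, args });

function* parallelFetch(url) { /* worker saga body elided */ }

function* fetchAll(urls) {
  // One yield of an array of effects: the middleware runs them
  // in parallel and resumes the saga with the array of results.
  const results = yield urls.map((url) => call(parallelFetch, url));
  return results;
}

const it = fetchAll(['/users', '/repos']);
const effects = it.next().value; // a flat array of two CALL descriptors
```

No fixed array size is needed: the map produces exactly as many effects as there are URLs.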
Thank you. I forgot what call function returns plain object and tried to build complex constructions with yield operator inside map function.
@yelouafi I'm loving your work these past few months especially. I came up with something similar to what you have here
yield urls.map( url => call(parallelFetch, ...) )
But I need the group of calls/forks to eventually block as a single task: the individual fetches shouldn't block each other, but once they all finish, the group as a whole should block.
yield [
urls.map( url => fork(parallelFetch, ...) )
]
What's your opinion on this? Do you think there is a better way? Is this redundant?
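Structurally, the extra array literal around the map does seem redundant: the map already produces an array of fork effects, so wrapping it yields a one-element array containing that array, one level of nesting deeper than the flat form. A sketch comparing the two shapes with a stand-in fork creator (assumed shape; whether the forks then block as a group is a separate question of redux-saga's fork semantics):

```javascript
// Stand-in for redux-saga's fork effect creator (assumed shape).
const fork = (fn, ...args) => ({ type: 'FORK', fn, args });

function* parallelFetch(url) { /* worker saga body elided */ }

function* wrapped(urls) {
  // Yields [[fork, fork]]: a one-element array holding the mapped array.
  yield [urls.map((url) => fork(parallelFetch, url))];
}

function* flat(urls) {
  // Yields [fork, fork]: the mapped array directly.
  yield urls.map((url) => fork(parallelFetch, url));
}

const urls = ['/users', '/repos'];
const a = wrapped(urls).next().value; // nested one level deep
const b = flat(urls).next().value;    // flat array of fork descriptors
```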