Redux-saga: Running tasks in parallel with variable task size

Created on 19 Feb 2016 · 3 comments · Source: redux-saga/redux-saga

I've tried to fetch and process several URLs simultaneously, but I found there is no way to write the yield when the array of tasks has a variable size.

You have to know the exact size of the array in advance and write it out like this:

yield [
  call(fetch, '/users'),
  call(fetch, '/repos')
]

The only way I found to do it is to fork the fetches, collect the tasks, and then manually join them:

const tasks = [];
for (let i = 0; i < urls.length; ++i) {
    tasks[i] = yield fork(parallelFetch, urls[i], actions);
}
for (let i = 0; i < tasks.length; ++i) {
    yield join(tasks[i]);
}

where parallelFetch is:

function* parallelFetch(url, actions) {
    try {
        const response = yield call(fetch, url);
        // yielding a promise suspends the saga until it resolves
        const data = yield response.json();
        yield put(actions.success(data));
    } catch (e) {
        yield put(actions.error(e));
    }
}
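Put together, the two snippets above compose into one saga. Below is a minimal sketch of that composition; the effect creators are plain-object stand-ins mirroring the shapes redux-saga's real `call`/`fork`/`join`/`put` produce, so the generator can be stepped by hand without the middleware, and `fetchAllUrls` is a hypothetical wrapper name:

```javascript
// Plain-object stand-ins for redux-saga's effect creators -- same shape,
// no middleware needed, so the generator can be stepped by hand.
const call = (fn, ...args) => ({ CALL: { fn, args } });
const fork = (fn, ...args) => ({ FORK: { fn, args } });
const join = (task) => ({ JOIN: { task } });
const put = (action) => ({ PUT: { action } });

function* parallelFetch(url, actions) {
  try {
    const response = yield call(fetch, url);
    const data = yield response.json();
    yield put(actions.success(data));
  } catch (e) {
    yield put(actions.error(e));
  }
}

// Hypothetical wrapper: fork every fetch first (non-blocking),
// then join each collected task so the saga finishes only when all do.
function* fetchAllUrls(urls, actions) {
  const tasks = [];
  for (let i = 0; i < urls.length; ++i) {
    tasks.push(yield fork(parallelFetch, urls[i], actions));
  }
  for (let i = 0; i < tasks.length; ++i) {
    yield join(tasks[i]);
  }
}
```

Stepping `fetchAllUrls` yields one fork effect per URL, then one join effect per collected task, which is exactly the pattern in the snippets above.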

Are there any other ways to do this?

Sorry for my English.

question

All 3 comments

You can do

yield urls.map( url => call(parallelFetch, ...) )

Thank you. I forgot that the call function returns a plain object and was trying to build complex constructions with the yield operator inside the map callback.
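That point is worth demonstrating: because `call` only builds a plain effect description, `urls.map(...)` produces a plain array in a single yield, which the middleware then runs in parallel. A sketch of that, using a stand-in `call` with the same plain-object shape and a hypothetical `fetchUrl` in place of a real fetcher:

```javascript
// Stand-in with the same plain-object shape as redux-saga's call effect.
const call = (fn, ...args) => ({ CALL: { fn, args } });

// Hypothetical fetcher standing in for a real fetch.
const fetchUrl = (url) => Promise.resolve({ url });

function* fetchAll(urls) {
  // A single yield of an ARRAY of effect objects: the middleware
  // runs them in parallel and resumes with an array of results.
  const responses = yield urls.map((url) => call(fetchUrl, url));
  return responses;
}
```

Stepping the generator once shows it yields the whole array of effect descriptions in one step, with no library code involved yet.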

@yelouafi I'm loving your work these past few months especially. I came up with something similar to what you have here

yield urls.map( url => call(parallelFetch, ...) )

But I need the group of calls/forks to eventually block as a single task: the individual fetches shouldn't block each other, but the saga should block on the group as a whole until they have all finished.

yield [ urls.map( url => fork(parallelFetch, ...) ) ]

What's your opinion on this? Do you think there is a better way? Is this redundant?
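One way to get "parallel inside, one blocking unit outside" is to wrap the mapped calls in their own saga, fork that saga as a single task, and join it later. A sketch under that assumption; the effect creators are plain-object stand-ins mirroring redux-saga's shapes, and `fetchGroup`/`outerSaga` are hypothetical names:

```javascript
// Plain-object stand-ins for redux-saga's effect creators.
const call = (fn, ...args) => ({ CALL: { fn, args } });
const fork = (fn, ...args) => ({ FORK: { fn, args } });
const join = (task) => ({ JOIN: { task } });

function* parallelFetch(url, actions) {
  /* fetch + dispatch, as in the issue */
}

// Hypothetical grouping saga: one blocking step that runs all
// fetches in parallel and resumes only when every one has finished.
function* fetchGroup(urls, actions) {
  yield urls.map((url) => call(parallelFetch, url, actions));
}

function* outerSaga(urls, actions) {
  // The whole group runs as ONE task; outerSaga is not blocked here.
  const groupTask = yield fork(fetchGroup, urls, actions);
  // ...other work can happen in between...
  // Block exactly once, on the group as a single unit.
  yield join(groupTask);
}
```

This keeps the fetches non-blocking with respect to each other while giving the outer saga a single task handle to block on.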
