FastAPI's async/await does not provide concurrency with the PyTorch library

Created on 9 Oct 2020 · 3 Comments · Source: tiangolo/fastapi

Hello dear FastAPI community,

My problem is that all requests are blocked until a function in one route finishes, even though `async` and `await` are declared.

I trigger the training process of an NN model via a request and use the PyTorch library for that. This process blocks all other requests.

Is there any solution for this? Unfortunately, I could not find anything that helps.

Thanks in advance!

question

All 3 comments

You should run the model in a different thread, by calling it from a synchronous endpoint.

Use `starlette.concurrency.run_in_threadpool`.

Tutorial: https://bocadilloproject.github.io/guide/async.html#tl-dr

Or use a background task: https://fastapi.tiangolo.com/tutorial/background-tasks/

Please close your issue if you do not have a feature request or a FastAPI bug.

Thanks!

Will try it out today, thanks everyone!

