FastAPI: Functions with "async def" will block the "def" functions

Created on 8 Dec 2020 · 3 comments · Source: tiangolo/fastapi

I am building a web service with FastAPI. Some functions have to be awaited, so I wrote an "async def" function named "get_list". But when the "get_list" function was requested and executing, and I requested the "def" function named "get_item" at the same time, the "def" function was blocked by the time-consuming code. I am confused.

So I want to know how to keep "async def" functions with time-consuming operations inside from blocking the "def" functions.

I use "for i in range(100000000)" to simulate a time-consuming operation that blocks the "def" functions.

# To simulate the time-consuming operation
for i in range(100000000):
    pass

All codes below:

import asyncio
import uvicorn

from fastapi import FastAPI

app = FastAPI()


@app.get('/get_item')
def get_item():
    # Will be blocked by 'get_list'
    return {"Key": "Value"}


@app.get('/get_list')
async def get_list():
    return await get_list_async()


async def get_list_async():

    # To simulate the time-consuming operation
    for i in range(100000000):
        pass

    print('start')
    await asyncio.sleep(10)
    print('end')

    # To simulate the time-consuming operation
    for i in range(100000000):
        pass

    ret_list = ['Item1', 'Item2', 'Item3']
    return ret_list


if __name__ == '__main__':
    uvicorn.run(app='main:app', host='0.0.0.0', port=8000, reload=True, debug=True)

question


All 3 comments

As I said on #2484, async def are not functions.

What's happening on your get_list_async is that you have an expensive CPU-bound operation (i.e. your coroutine progress is limited by your CPU clocks).

You don't want those kinds of sync blocking operations inside the event loop, because they are going to block it (i.e. once that coroutine is selected to run, everything up to the next await will necessarily run, so every other coroutine waiting to run cannot).

Solutions (in order of what you should do):

  • Turn your get_list into a plain def function (FastAPI will run it in a threadpool for you).
  • Turn your sync blocking operations into coroutines (this is not something you want to do).

I'm not going to explain how to implement the above solutions; it's beyond the scope of this issue.
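The effect of the first solution can be sketched with plain asyncio: FastAPI does something similar for plain def endpoints by running them in a threadpool. A minimal sketch (not FastAPI itself), assuming Python 3.9+ for asyncio.to_thread; the function names are illustrative:

```python
import asyncio

def cpu_bound():
    # stand-in for the issue's time-consuming loop
    for i in range(10_000_000):
        pass
    return ['Item1', 'Item2', 'Item3']

events = []

async def get_list():
    # offload the blocking work to a worker thread; the event loop stays free
    result = await asyncio.to_thread(cpu_bound)
    events.append('get_list')
    return result

async def get_item():
    # runs while the CPU-bound work is still going in the worker thread
    events.append('get_item')
    return {"Key": "Value"}

async def main():
    return await asyncio.gather(get_list(), get_item())

results = asyncio.run(main())
print(events)  # ['get_item', 'get_list']
```

Because get_list suspends at the await, get_item completes first instead of waiting behind the loop.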

If you have CPU-intensive code, running it in a thread or in the event loop is not ideal, no matter what. You probably want to use a separate process for this. There are many options for this; multiprocessing is one.

I recently saw another interesting solution, using subprocess to execute Python code, which I haven't tried myself, but it could be a fun solution: https://til.simonwillison.net/python/subprocess-time-limit
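The core of that idea can be sketched with the stdlib alone: run a Python snippet in a child interpreter via subprocess, with a timeout that kills it if it runs too long (the snippet below is illustrative, not taken from the linked post):

```python
import subprocess
import sys

# run a CPU-heavy Python snippet in a child process; if it exceeds the
# time limit, subprocess.run kills it and raises subprocess.TimeoutExpired
snippet = "print(sum(range(1000)))"
proc = subprocess.run(
    [sys.executable, "-c", snippet],
    capture_output=True, text=True, timeout=5,
)
output = proc.stdout.strip()
print(output)  # 499500
```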

This is considered "normal" when it comes to async def functions, since those will run on the main thread (the event loop) and are not expected to do much work at all before either awaiting something else (which frees the main thread) or completing. This is true even if you run time.sleep(10) (not to be confused with asyncio.sleep(10)) instead of for i in range(100000000).

Keep in mind that the event loop handles the connection management and routing as well, meaning it's never supposed to block for any amount of time, otherwise your application will become unresponsive.

Any operation that is expected to run for a noticeable amount of time should be written as a synchronous def function, since FastAPI runs those in separate threads.
Blocking operations where the CPU would be idle, like sleeping or network I/O, are not going to cause any more issues, since nothing is actually happening for most of the runtime of those functions.
CPU-intensive code is more complicated, because (due to the GIL) two threads can't be running Python code at the same time, meaning they can still block your main event-loop thread even if they're not part of that thread. This means you need to either run that code in separate processes instead of threads (which don't share memory) or offload any CPU-intensive operation to specialized libraries like numpy (which usually don't need to access any Python objects for the computation part).

If what you meant is that you need to execute a long-running sync function from within a function that also needs to run async code, FastAPI uses starlette.concurrency.run_in_threadpool to run sync functions inside async contexts, although they have to be functions and are still subject to the same caveats listed above.
