Fastapi: [BUG] FastAPI gets terminated when child multiprocessing process terminated

Created on 27 May 2020 · 11 Comments · Source: tiangolo/fastapi

Describe the bug

Make a multiprocessing Process and start it.
Right after terminating the child process, FastAPI itself (the parent) terminates.

To Reproduce

Start command: /usr/local/bin/uvicorn worker.stts_api:app --host 127.0.0.1 --port 8445

  1. Create a file with:
import multiprocessing

from fastapi import FastAPI

app = FastAPI()


@app.post('/task/run')
def task_run(task_config: TaskOptionBody):
    proc = multiprocessing.Process(
        target=task.run,
        args=(xxxx,))
    proc.start()
    return task_id

@app.get('/task/abort')
def task_abort(task_id: str):
    proc.terminate()
    return result_OK
  2. Run task_run and, while the process is alive, trigger task_abort.
  3. After the child process terminates, the parent (FastAPI) terminates as well.
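In the snippet above, task_abort references a proc that is never defined in that scope. One way to make the two endpoints share state is a module-level registry keyed by task id; the sketch below is illustrative (run_task, abort_task, and the _procs dict are names I introduce here, not from the original report):

```python
import multiprocessing
import uuid
from typing import Dict

# Illustrative registry mapping task ids to their running processes.
_procs: Dict[str, multiprocessing.Process] = {}

def run_task(target, *args) -> str:
    """Start a child process and return a task id for later lookup."""
    task_id = str(uuid.uuid4())
    proc = multiprocessing.Process(target=target, args=args)
    proc.start()
    _procs[task_id] = proc
    return task_id

def abort_task(task_id: str) -> bool:
    """Terminate the process registered under task_id, if still alive."""
    proc = _procs.pop(task_id, None)
    if proc is None or not proc.is_alive():
        return False
    proc.terminate()
    proc.join()
    return True
```

The FastAPI handlers would then call run_task and abort_task instead of holding the Process object directly.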

Expected behavior

The parent process should not terminate when the child process is terminated.

Environment

  • OS: Linux
  • FastAPI Version 0.54.1
  • Python version 3.8.2

Additional context

I tried the same code with Flask under gunicorn; the parent never terminated.

bug

Most helpful comment

Hi, I have looked into this situation and came to the following conclusions:
1) Uvicorn registers signal handlers, and a forked child process inherits them (along with the ThreadPoolExecutor and other resources).
2) You cannot set signal handlers from outside the main thread.
3) The task function is executed in the ThreadPoolExecutor, so, as I said above, you cannot change signal handlers inside it.

But it is still possible to solve this problem (without changing FastAPI or uvicorn): change the multiprocessing start method to spawn, and your child process will be clean (no inherited signal handlers, thread pools, or other state).

import multiprocessing

@app.post('/task/run')
def task_run():
    # force=True lets this run on every request without a RuntimeError
    multiprocessing.set_start_method('spawn', force=True)
    proc = multiprocessing.Process(
        target=task,
        args=(10,))
    proc.start()

    return proc.pid

It works for me (Python 3.7, macOS 10.15.5).

All 11 comments

Hi @jongwon-yi,
Just wanted to ask what 'TaskOptionBody' is referring to in the definition of task_run

Hi @jongwon-yi,
Just wanted to ask what 'TaskOptionBody' is referring to in the definition of task_run

class TaskOptionBody(BaseModel):
    owner: str
    description: str
    subscribers: str
    devices: list
    options: dict
    protocol: int

This behavior isn't related to request headers or payloads. You can reproduce it simply:

  1. Run FastAPI with uvicorn.
  2. Start a multiprocessing Process with the default (fork) start method and save the proc somewhere.
  3. Terminate the child via the saved proc (proc.terminate()).

@Kludex May I work on this?

@victorphoenix3 I'm not in charge of anything hahaha. If I were you, I'd wait for someone else to confirm the bug (you can confirm it yourself), then, if it's really a problem, you can work on it. There are no PRs related to this issue. :)

@tiangolo @Kludex I could not reproduce this issue on my system. Terminating the child process did not terminate the parent. Can you please re-confirm?
This is the code I used:

from fastapi import FastAPI
from pydantic import BaseModel
import multiprocessing
import os, signal
import psutil

app = FastAPI()

class TaskOptionBody(BaseModel): 
    owner: str 
    description: str 
    subscribers: str 
    devices: list 
    options: dict 
    protocol: int

def task(pid: int):
    print(f"{pid} {os.getpid()}")

@app.post('/task/run')
def task_run(task_config: TaskOptionBody, pid: int):
    proc = multiprocessing.Process(
        target=task,
        args=(pid,))
    proc.start()
    return os.getpid()

@app.get('/task/abort')
def task_abort(pid: int):
    proc = psutil.Process(pid)
    proc.terminate()
    return 0

Oh, that's amazing, thanks a lot for taking the time to debug and try to reproduce it @victorphoenix3 ! :clap: :bow: That helps a lot! :cake:

I tried it locally and wasn't able to reproduce it either. @jongwon-yi please check with @victorphoenix3's example.

@victorphoenix3
It seems like your process needs to run some long-running code.
Just add "time.sleep(30)" and try to abort it within that time.

I tried your code and there was no issue (because the subprocess had already terminated..?).
1 21742
INFO: 127.0.0.1:52778 - "POST /task/run HTTP/1.1" 200 OK
INFO: 127.0.0.1:52780 - "GET /task/abort?pid=21742 HTTP/1.1" 200 OK

But after adding a 30-second sleep, the issue appears.
1 21982
INFO: 127.0.0.1:52802 - "POST /task/run HTTP/1.1" 200 OK
INFO: 127.0.0.1:52804 - "GET /task/abort?pid=21982 HTTP/1.1" 200 OK
INFO: Shutting down
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
INFO: Finished server process [21973]

With the sleep I was able to reproduce it as well, fyi.

@Kludex can you tell me what I am doing wrong? I still haven't been able to reproduce it.
Here's how I modified the child process to add a sleep; I aborted it before it printed the "child process complete" message.

import time

def task(pid: int):
    print(f"{pid} {os.getpid()}")
    time.sleep(50)
    print("child process complete")

and it terminates without shutting down FastAPI

INFO:     127.0.0.1:40570 - "POST /task/run?pid=0 HTTP/1.1" 200 OK
0 10409
INFO:     127.0.0.1:40578 - "GET /task/abort?pid=10409 HTTP/1.1" 200 OK

You're doing the same thing as us, but it works just fine for you. I'll paste here my configs and python packages/version later.

Hi, I have looked into this situation and came to the following conclusions:
1) Uvicorn registers signal handlers, and a forked child process inherits them (along with the ThreadPoolExecutor and other resources).
2) You cannot set signal handlers from outside the main thread.
3) The task function is executed in the ThreadPoolExecutor, so, as I said above, you cannot change signal handlers inside it.

But it is still possible to solve this problem (without changing FastAPI or uvicorn): change the multiprocessing start method to spawn, and your child process will be clean (no inherited signal handlers, thread pools, or other state).
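Point 1 can be observed directly: with the default fork start method the child keeps the parent's SIGTERM handler, while spawn gives it a fresh interpreter with default handlers. A minimal sketch (the lambda handler stands in for the ones uvicorn installs; the "fork" context is Unix-only, and the helper names are illustrative):

```python
import multiprocessing as mp
import signal

def _report(q):
    # Runs in the child: report whether SIGTERM is still at its default.
    q.put(signal.getsignal(signal.SIGTERM) == signal.SIG_DFL)

def child_has_default_sigterm(method: str) -> bool:
    """Start a child with the given start method and ask it about SIGTERM."""
    ctx = mp.get_context(method)
    q = ctx.Queue()
    p = ctx.Process(target=_report, args=(q,))
    p.start()
    answer = q.get()
    p.join()
    return answer

if __name__ == "__main__":
    # Stand-in for the handlers uvicorn installs at startup.
    signal.signal(signal.SIGTERM, lambda signum, frame: None)
    print("fork :", child_has_default_sigterm("fork"))    # handler inherited
    print("spawn:", child_has_default_sigterm("spawn"))   # clean child
```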

import multiprocessing

@app.post('/task/run')
def task_run():
    # force=True lets this run on every request without a RuntimeError
    multiprocessing.set_start_method('spawn', force=True)
    proc = multiprocessing.Process(
        target=task,
        args=(10,))
    proc.start()

    return proc.pid

It works for me (Python 3.7, macOS 10.15.5).
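One caveat with set_start_method: it may only be called once per process, so calling it inside a request handler raises a RuntimeError on the second request unless force=True is passed. An alternative that avoids touching global state is multiprocessing.get_context; the sketch below is my suggestion rather than something from this thread, and the task function is a placeholder:

```python
import multiprocessing

def task(n: int) -> None:
    # Placeholder workload for the child process.
    print("child received", n)

def start_clean_child(n: int) -> int:
    # get_context returns an independent context object instead of mutating
    # the global start method, so it is safe to call on every request.
    ctx = multiprocessing.get_context("spawn")
    proc = ctx.Process(target=task, args=(n,))
    proc.start()
    return proc.pid
```

A request handler can call start_clean_child directly, and other parts of the app that rely on the default start method are unaffected.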
