FastAPI: [BUG] no asynchronous file-write operations possible when opening a subprocess

Created on 18 Jan 2020 · 4 comments · Source: tiangolo/fastapi

Describe the bug

As soon as a subprocess is started, it should write its output asynchronously, directly into a file, so that the progress of the subprocess is visible while it is still running.

But as soon as such functionality runs under FastAPI with uvicorn, the output is not written until the process has finished.

Is there a way to fix this?


To Reproduce

Steps to reproduce the behavior with a minimal self-contained file.


  1. Create a file with:

```python
import subprocess

from fastapi import FastAPI

app = FastAPI()


@app.get("/")
def read_root():
    # Append the subprocess's stdout to a log file.
    with open("output.log", 'a+') as f:
        subprocess.Popen("python scrap_test.py",
                         stdout=f,
                         universal_newlines=True)
```

  2. Next, create a `scrap_test.py` file in the same folder:

```python
from time import sleep

for i in range(360):
    sleep(1)
    print(i)
```


  3. Open the browser and call the endpoint `/`.
  4. Now the created `output.log` file should be continuously written to, but instead the content is only written once, when the process finishes.
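A likely contributing factor, independent of FastAPI: when a Python process's stdout is redirected to a file instead of a terminal, Python block-buffers it, so output only reaches the file when the buffer fills or the process exits. A minimal sketch of launching the child unbuffered with `python -u` (the helper name `start_logged_job` is made up for illustration; in the app above it would be called from inside the endpoint):

```python
import subprocess
import sys


def start_logged_job(log_path="output.log"):
    """Launch scrap_test.py so each print() lands in the log immediately.

    The "-u" flag disables the child interpreter's block buffering;
    without it, stdout redirected to a file is only flushed when the
    buffer fills or the process exits.
    """
    log = open(log_path, "a+")  # the child process inherits this handle
    return subprocess.Popen(
        [sys.executable, "-u", "scrap_test.py"],
        stdout=log,
        universal_newlines=True,
    )
```

Whether this alone fixes the behavior under uvicorn on Windows is untested here, but it rules out the most common cause of "output appears only at exit".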

### Expected behavior

The output.log should continuously grow, but it isn't growing.

### Environment

- OS: Windows
- FastAPI version: 0.45.0, obtained with:

```bash
python -c "import fastapi; print(fastapi.__version__)"
```

- Python version: 3.7.3, obtained with:

```bash
python --version
```

Additional context

Running the same subprocess code in plain Python, without FastAPI/uvicorn, writes the output file continuously, as expected.

bug


All 4 comments

I am not sure if it is related, but a number of issues have been reported related to the use of subprocesses. I suspect the issue is related to something lower level than fastapi, but I'm not sure whether that's starlette, uvicorn, or something else.


For the time being, if getting this to work reliably is critical, I would strongly recommend using a dedicated worker service via something like celery or arq over trying to get subprocess calls to work inside an endpoint.

If you do figure it out, please post an update here!

Is it also possible to execute command line commands with these dedicated workers?

I agree that if it's something critical or heavy, using something like Celery or ARQ would probably be a better option.
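To the question above: yes — a worker is just ordinary Python, so it can run any command-line program via `subprocess`. A stdlib-only sketch of the idea (a real deployment would use Celery or arq with a message broker; the thread-and-queue worker here is only a stand-in):

```python
import queue
import subprocess
import sys
import threading

jobs = queue.Queue()     # commands submitted by the web process
results = queue.Queue()  # captured output from finished jobs


def worker():
    # The dedicated worker runs command-line programs; the web process
    # only enqueues a command and returns to the client immediately.
    while True:
        cmd = jobs.get()
        completed = subprocess.run(cmd, capture_output=True, text=True)
        results.put(completed.stdout.strip())
        jobs.task_done()


threading.Thread(target=worker, daemon=True).start()

# Enqueue a command-line job: here, the Python interpreter itself.
jobs.put([sys.executable, "-c", "print('ran in worker')"])
```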

Without a complete example it's difficult to see what else could be going on. But maybe you could try forcing flushes from time to time, as writing to streams (files or stdout) is done with buffers.
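Concretely, the flush suggestion can be applied inside the child script itself: `print(..., flush=True)` (available since Python 3.3) pushes each line through the buffer as it is produced. A sketch of `scrap_test.py` adjusted that way (shortened from the original `range(360)` so it finishes quickly):

```python
from time import sleep

for i in range(5):
    sleep(0.1)
    # flush=True forces each line out to the redirected file immediately,
    # instead of waiting for the block buffer to fill or the process to exit.
    print(i, flush=True)
```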

Assuming the original issue was solved, it will be automatically closed now. But feel free to add more comments or create new issues.

