FastAPI: [QUESTION] Strategies for limiting upload file size

Created on 2 Jul 2019 · 7 comments · Source: tiangolo/fastapi

Description

I'm trying to create an upload endpoint. I want to limit the maximum size that can be uploaded.

My endpoint looks like this:

@app.post('/upload', response_model=UploadedFileDTO)
async def upload_file(file: UploadFile = File(...), db: Session = Depends(get_db_session)):
    save_path = local.generate_path(file.filename)
    with file.file as f:
        local.save(stream=f, save_path=save_path)

    u = Upload(filename=file.filename,
               path=str(save_path))
    db.add(u)
    db.commit()

    return u

I checked out the source for fastapi.params.File, but it doesn't seem to add anything over fastapi.params.Form.

The only solution that comes to mind is to save the uploaded file in chunks and raise an exception once the number of bytes read exceeds the limit. But I'm wondering whether there are any idiomatic ways of handling such scenarios?

question

Most helpful comment

Thanks everyone for the discussion here!

So, here's the thing: a file is not completely sent to the server and received by your FastAPI app before the code in the path operation starts to execute.

So you don't really have a way of knowing the file's actual size before reading it.


You could require the Content-Length header and check it and make sure that it's a valid value. E.g.

from fastapi import FastAPI, File, Header, Depends, UploadFile


async def valid_content_length(content_length: int = Header(..., lt=50_000_000)):
    return content_length


app = FastAPI()

@app.post('/upload', dependencies=[Depends(valid_content_length)])
async def upload_file(file: UploadFile = File(...)):
    # do something with file
    return {"ok": True}

And then you could re-use that valid_content_length dependency in other places if you need to.

:warning: but it probably won't prevent an attacker from sending a valid Content-Length header and a body bigger than what your app can take :warning:


Another option would be to, on top of the header, read the data in chunks. And once it's bigger than a certain size, throw an error.

E.g.

from typing import IO

from tempfile import NamedTemporaryFile
import shutil

from fastapi import FastAPI, File, Header, Depends, UploadFile, HTTPException
from starlette import status


async def valid_content_length(content_length: int = Header(..., lt=80_000)):
    return content_length


app = FastAPI()


@app.post("/upload")
def upload_file(
    file: UploadFile = File(...), file_size: int = Depends(valid_content_length)
):
    real_file_size = 0
    temp: IO = NamedTemporaryFile(delete=False)
    # Iterating file.file directly yields lines, and a single long "line"
    # could far exceed the limit before the check runs; read fixed-size
    # chunks instead
    while chunk := file.file.read(1024 * 1024):
        real_file_size += len(chunk)
        if real_file_size > file_size:
            temp.close()
            raise HTTPException(
                status_code=status.HTTP_413_REQUEST_ENTITY_TOO_LARGE, detail="Too large"
            )
        temp.write(chunk)
    temp.close()
    shutil.move(temp.name, "/tmp/some_final_destiny_file")
    return {"ok": True}

All 7 comments

Ok, I've found an acceptable solution, but it relies on the Content-Length header being present.

Edit: I've added a check to reject requests without Content-Length

from starlette import status
from starlette.middleware.base import BaseHTTPMiddleware, RequestResponseEndpoint
from starlette.requests import Request
from starlette.responses import Response
from starlette.types import ASGIApp


class LimitUploadSize(BaseHTTPMiddleware):
    def __init__(self, app: ASGIApp, max_upload_size: int) -> None:
        super().__init__(app)
        self.max_upload_size = max_upload_size

    async def dispatch(self, request: Request, call_next: RequestResponseEndpoint) -> Response:
        if request.method == 'POST':
            if 'content-length' not in request.headers:
                return Response(status_code=status.HTTP_411_LENGTH_REQUIRED)
            content_length = int(request.headers['content-length'])
            if content_length > self.max_upload_size:
                return Response(status_code=status.HTTP_413_REQUEST_ENTITY_TOO_LARGE)
        return await call_next(request)

Using it is quite straightforward:

app = FastAPI()
app.add_middleware(LimitUploadSize, max_upload_size=50_000_000)  # ~50MB

The server sends an HTTP 413 response when the upload is too large, but I'm not sure how to handle requests without a Content-Length header.
Edit: Solution: send an HTTP 411 response.

The server sends an HTTP 413 response when the upload is too large, but I'm not sure how to handle requests without a Content-Length header.

You can reply with HTTP 411 if Content-Length is absent.

@tiangolo This would be a great addition to the base package

For what it's worth, both nginx and traefik have lots of functionality related to request buffering and limiting maximum request size, so you shouldn't need to handle this via FastAPI in production, if that's the concern.

You can use an ASGI middleware to limit the body size.

Example: https://github.com/steinnes/content-size-limit-asgi


Assuming the original issue was solved, it will be automatically closed now. But feel free to add more comments or create new issues.

