Sanic: using same middleware for request and response

Created on 28 Oct 2019 · 7 comments · Source: sanic-org/sanic

The aiohttp and Starlette frameworks use the same function as both request and response middleware.

Examples:
https://aiohttp.readthedocs.io/en/stable/web_advanced.html#middlewares
https://www.starlette.io/middleware/
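For illustration, the pattern those links describe can be sketched framework-agnostically (plain asyncio, not real aiohttp or Starlette code; the dict-based request/response and the name call_next are stand-ins):

```python
import asyncio
import time

# Hypothetical single-function middleware in the aiohttp/Starlette style:
# it receives the request and the next handler, awaits the handler, and
# can therefore act in both the request phase and the response phase.
async def timing_middleware(request, call_next):
    start = time.perf_counter()           # request phase
    response = await call_next(request)   # hand off to the rest of the chain
    elapsed = time.perf_counter() - start
    response["X-Elapsed"] = f"{elapsed:.6f}"  # response phase
    return response

async def handler(request):
    # Toy handler; a real framework would return a Response object.
    return {"body": "hello"}

response = asyncio.run(timing_middleware({"path": "/"}, handler))
```

Note how a single local variable (start) spans both phases, which is the main ergonomic difference from registering two separate functions.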

I think that behaviour is more concise and explicit than using separate functions.
Sanic has a similar feature - handler decorators:
https://sanic.readthedocs.io/en/latest/sanic/decorators.html

But they must wrap each handler separately, while middleware applies globally to the whole app.

  • are there any benefits to using separate functions for request and response middleware?
  • would it be better for Sanic to handle middleware this way instead of the current approach?
    (maybe adapt handler decorators so they can apply globally to the whole app)

I wrote this to open a discussion, and maybe to add this to the roadmap if it's worthwhile.
Thanks.

stale


All 7 comments

@vlad0337187 You can use a single middleware for both request and response in sanic as well. Nothing is stopping you from doing that; it's just that the suggested way is to use two different functions.

from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)


@app.middleware('response')
@app.middleware('request')
async def generic_middleware(request, response=None):
    if not response:
        print("request")
        request.headers["Server"] = "Fake-Server"
    else:
        print("response")
        response.headers["x-xss-protection"] = "1; mode=block"



@app.get("/")
async def index(request):
    print(request.headers)
    return json({
        "msg": "Hello"
    })


app.run(port=7654, host="0.0.0.0")
> http http://0.0.0.0:7654
HTTP/1.1 200 OK
Connection: keep-alive
Content-Length: 15
Content-Type: application/json
Keep-Alive: 5
x-xss-protection: 1; mode=block

{
    "msg": "Hello"
}
[2019-11-08 00:11:01 +0530] [95445] [INFO] Goin' Fast @ http://0.0.0.0:7654
[2019-11-08 00:11:01 +0530] [95445] [INFO] Starting worker [95445]
request
<Header('host': '0.0.0.0:7654', 'connection': 'keep-alive', 'accept-encoding': 'gzip, deflate', 'accept': '*/*', 'user-agent': 'HTTPie/0.9.9', 'Server': 'Fake-Server')>
response

@harshanarayana 's workaround doesn't quite address the same issue, which is that the middleware would call and await the inner parts of the chain. That allows carrying state from request middleware to response middleware in local variables.

I am not sure whether @vlad0337187 's suggestion should be implemented or not. There are cases where it can be useful, but it also requires a different architecture than the current middleware loops.

Pros:

  • Similar to RAII idiom, simple and correctly ordered coupling for things that require "construct" and "deconstruct" steps
  • Locally held state allows things like wrapping further processing (next handler) into a with block
  • Explicitly call "next handler" to continue processing (normal program flow, no return value magic)

    • Full control over request and response objects, can re-route, call multiple handlers, ...

    • Possibly even more explicit logic required for streaming responses

Cons:

  • Unnecessarily harder when you only need request or response middleware
  • Middlewares would require even more code to handle different phases of response (headers/streaming/after)
  • Implementing both modes of operation, to keep backwards compatibility, would be even messier
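The first two pros can be illustrated with a hypothetical sketch (plain asyncio, not Sanic API; db_session and call_next are made-up names):

```python
import asyncio
from contextlib import asynccontextmanager

# Hypothetical resource with explicit construct/deconstruct steps.
@asynccontextmanager
async def db_session():
    session = {"open": True}      # "construct"
    try:
        yield session
    finally:
        session["open"] = False   # "deconstruct", RAII-style ordering

# A single middleware wrapping the next handler: the with block and the
# local variable span both the request and the response phase.
async def session_middleware(request, call_next):
    async with db_session() as session:
        request["session"] = session
        response = await call_next(request)
        response["X-Session-Was-Open"] = session["open"]
    # session is closed here, after the response was produced
    return response

async def handler(request):
    return {"status": 200}

response = asyncio.run(session_middleware({}, handler))
```

With separate request and response middleware functions, the session would instead have to be stashed on the request object and cleaned up in a second function, with no guaranteed pairing between the two.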

@Tronic nice layout of pros and cons. I am not sure that your pro list outweighs the cons, especially given the potential change that this would require. The with block idea does seem useful, but I'm not sure that utility outweighs the seemingly larger set of use cases where having a separation leads to lower complexity.

Personally, I am not in favor. Other thoughts?

I'm inclined to say pull requests welcome, because an actual implementation would show much better whether the feature is worthwhile. FWIW, a simple wrapper function can provide compatibility for existing middleware if only the suggested API is implemented natively, and the wrapping can be done transparently by the middleware registration API.
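Such a compatibility wrapper could look roughly like this (a sketch only; wrap_legacy, call_next, and the short-circuit semantics are assumptions for illustration, not Sanic API):

```python
import asyncio

# Hypothetical adapter: turn a legacy request/response middleware pair
# into a single call-through middleware. Assumes legacy semantics where
# a non-None return value from request middleware short-circuits the
# chain, and a non-None return from response middleware replaces the
# response.
def wrap_legacy(request_mw=None, response_mw=None):
    async def middleware(request, call_next):
        if request_mw is not None:
            early = await request_mw(request)
            if early is not None:
                return early                  # short-circuit the chain
        response = await call_next(request)
        if response_mw is not None:
            replaced = await response_mw(request, response)
            if replaced is not None:
                response = replaced
        return response
    return middleware

# Usage with toy legacy middlewares:
async def legacy_request_mw(request):
    request["seen"] = True

async def legacy_response_mw(request, response):
    response["X-Done"] = "yes"

async def handler(request):
    return {"status": 200}

mw = wrap_legacy(legacy_request_mw, legacy_response_mw)
response = asyncio.run(mw({}, handler))
```

The registration API could apply such a wrapper transparently, so existing two-function middleware would keep working unchanged.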

Thank you.
Just wanted to clarify this.
I'll look at the code sometime to check how this is implemented in aiohttp and Sanic.

Node.js Express also uses a next function in its middleware, which might be worth considering given that Express is probably the most popular async web framework in the world.

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. If this is incorrect, please respond with an update. Thank you for your contributions.
