I could not figure out a way to send a continuous stream of data to the client over HTTP when the data size is not known upfront. Are there any recommended approaches?

I should be able to call `StreamResponse.write` continuously, but only the first `write()` succeeds; then `resp.write_eof()` gets called, and all subsequent writes raise `RuntimeError("Cannot call write() after write_eof()")`:
```python
import asyncio
from aiohttp import web

async def hello(request):
    async def stream_response(response: web.StreamResponse):
        for i in range(100):
            await response.write(('line %d\n' % i).encode('utf-8'))
            await asyncio.sleep(1)

    response = web.StreamResponse()
    response.content_type = 'text/plain'
    await response.prepare(request)
    # Returning from the handler finalizes the response (write_eof()),
    # so the background task's subsequent write() calls fail.
    streaming_task = asyncio.ensure_future(stream_response(response))
    return response
```
```python
import asyncio
from aiohttp import web

async def hello(request):
    response = web.StreamResponse()
    response.content_type = 'text/plain'
    await response.prepare(request)
    for i in range(100):
        await response.write(('line %d\n' % i).encode('utf-8'))
        await asyncio.sleep(1)
    return response
```
@webknjaz I think labels should be only "question" and "server"
Agreed. Removed the irrelevant labels.
I'm not sure that's what the topic starter meant. I think they were asking how to put bytes into the TCP stream as you go, without building a response containing all the data before responding.
For example, in CherryPy you can do this by yielding chunks from the handler instead of returning a complete response, or by returning a file-like object: http://docs.cherrypy.org/en/latest/advanced.html#how-streaming-output-works-with-cherrypy
It looks like @jrozentur is asking about a similar feature.
In WSGI you must immediately return either the response body or an iterator over the body content.
aiohttp doesn't have this limitation: you can send the response headers with a `resp.prepare()` call and after that write response chunks one by one. Pausing and context switching between chunk writes (e.g. `await do_something()`) is completely fine.
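For the curious, this works on the wire because with no `Content-Length` known upfront, HTTP/1.1 falls back to chunked transfer encoding: each chunk is framed as a hex length, CRLF, the payload, and another CRLF, and a zero-length chunk marks the end of the stream. A minimal sketch of that framing (illustration of the protocol only; aiohttp does this internally, and `frame_chunk` is not an aiohttp API):

```python
def frame_chunk(data: bytes) -> bytes:
    """Frame one HTTP/1.1 chunk: hex size, CRLF, payload, CRLF."""
    return b"%x\r\n%s\r\n" % (len(data), data)

# Each streamed line goes out as its own chunk:
print(frame_chunk(b"line 0\n"))  # b'7\r\nline 0\n\r\n'
# A zero-length chunk terminates the stream (what write_eof() sends):
print(frame_chunk(b""))          # b'0\r\n\r\n'
```

Because each chunk is self-delimiting, the server never needs to know the total size in advance.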
Does `await response.write(b)` send the bytes over the network immediately?
@webknjaz yes, similar to the CherryPy link you provided. Is something like this already available in aiohttp?
@jrozentur see the example at https://github.com/aio-libs/aiohttp/issues/3952#issuecomment-517206512; it does not buffer all the data and sends chunks as soon as possible.
I've checked that example and it works. Probably your browser buffers the payload. Try checking with curl.
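When checking with curl, note that curl buffers its own output by default; pass `-N`/`--no-buffer` to see each chunk as it arrives (the `localhost:8080` address below is just a placeholder for wherever the app is running):

```shell
# -N / --no-buffer prints each chunk as soon as it arrives
curl -N http://localhost:8080/
```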