aiohttp - concurrent requests are getting hung

Created on 16 Apr 2019 · 10 comments · Source: aio-libs/aiohttp

Concurrent requests are getting hung. Here's a sample code that I am using to test concurrent requests.

import aiohttp
import asyncio

async def fetch(session, url):
    async with session.get(url) as response:
        print(await response.text())

async def main(n):
    url = "https://httpstat.us/200"
    async with aiohttp.ClientSession() as session:
        tasks = [asyncio.create_task(fetch(session, url)) for _ in range(n)]
        await asyncio.gather(*tasks)

asyncio.run(main(10))

When I make 10 concurrent requests, the first 4-5 requests are made concurrently, then it hangs for over 10 seconds before starting the remaining tasks, which hang again after running 2-3 more requests. If I make 100 concurrent requests, it makes about 25-30 concurrent requests, hangs, then makes 5-6 requests and hangs again; it continues like this until all tasks are complete.

It's taking over two minutes to make 100 requests to https://httpstat.us/200 with aiohttp.

If I don't use a persistent ClientSession and create new ClientSession for every request, then all hundred requests finish within 5 seconds without getting hung.

I am not sure what I am doing wrong here. Any help will be highly appreciated.

I am running Python 3.7.2 and aiohttp 3.5.4.
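For context (not part of the original report): by default, ClientSession uses a pooled TCPConnector with a limit of 100 connections and keep-alive enabled. A minimal sketch of tuning those knobs when diagnosing stalls like this; the limit value is illustrative:

```python
import aiohttp
import asyncio

async def fetch(session, url):
    async with session.get(url) as response:
        return response.status

async def main(n):
    # force_close=True disables keep-alive so each request gets a fresh
    # connection, approximating the "new session per request" behaviour
    # while keeping a single shared pool (limit=10 is illustrative).
    connector = aiohttp.TCPConnector(limit=10, force_close=True)
    async with aiohttp.ClientSession(connector=connector) as session:
        return await asyncio.gather(
            *(fetch(session, "https://httpstat.us/200") for _ in range(n))
        )

if __name__ == "__main__":
    print(asyncio.run(main(10)))
```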

All 10 comments

First, wrapping the coroutines in Tasks is not required; asyncio.gather accepts coroutines directly.
Second, I have tried it and everything works like a charm, without any hangs.

import aiohttp
import asyncio

async def fetch(session):
    async with session.get("https://httpstat.us/200") as response:
        print(await response.text())

async def main(n):
    async with aiohttp.ClientSession() as session:
        await asyncio.gather(*(fetch(session) for _ in range(n)))

asyncio.run(main(100))

I've run exactly your example and the result is the same: no hangs. Please check your firewall/gateway.

Feel free to reopen the issue. I'm closing it now since it cannot be reproduced.

> Feel free to reopen the issue. I'm closing it now since it cannot be reproduced.

@socketpair I've ruled out the possibility of firewall/gateway issues. Moreover, I don't see any hangs if I create a new ClientSession for every request.

With this code, I never see concurrent requests. That is, after 20-30 requests, the program just hangs.

import aiohttp
import asyncio

async def fetch(session):
    async with session.get("https://httpstat.us/200") as response:
        print(await response.text())

async def main(n):
    async with aiohttp.ClientSession() as session:
        await asyncio.gather(*(fetch(session) for _ in range(n)))

asyncio.run(main(100))

The following works like a charm. No hangs!

import aiohttp
import asyncio

async def fetch():
    # Create a new ClientSession for every request
    async with aiohttp.ClientSession() as session:
        async with session.get("https://httpstat.us/200") as response:
            print(await response.text())

async def main(n):
    await asyncio.gather(*(fetch() for _ in range(n)))

asyncio.run(main(100))

I am also seeing frequent SSL errors, which I believe is a known bug with aiohttp 3.5.4 on Python 3.7.x. However, I don't think the code hangs because of the SSL errors.

SSL error in data received
handle_traceback: Handle created at (most recent call last):
  File "concurrent_test.py", line 406, in <module>
    asyncio.run(main(), debug=True)
  File "/anaconda3/lib/python3.7/asyncio/runners.py", line 43, in run
    return loop.run_until_complete(main)
  File "/anaconda3/lib/python3.7/asyncio/base_events.py", line 560, in run_until_complete
    self.run_forever()
  File "/anaconda3/lib/python3.7/asyncio/base_events.py", line 528, in run_forever
    self._run_once()
  File "/anaconda3/lib/python3.7/asyncio/base_events.py", line 1756, in _run_once
    handle._run()
  File "/anaconda3/lib/python3.7/asyncio/events.py", line 88, in _run
    self._context.run(self._callback, *self._args)
  File "/anaconda3/lib/python3.7/asyncio/selector_events.py", line 716, in _add_reader
    self._loop._add_reader(fd, callback, *args)
  File "/anaconda3/lib/python3.7/asyncio/selector_events.py", line 260, in _add_reader
    handle = events.Handle(callback, args, self, None)
protocol: <asyncio.sslproto.SSLProtocol object at 0x1080d1e10>
transport: <_SelectorSocketTransport fd=12 read=polling write=<idle, bufsize=0>>
Traceback (most recent call last):
  File "/anaconda3/lib/python3.7/asyncio/sslproto.py", line 526, in data_received
    ssldata, appdata = self._sslpipe.feed_ssldata(data)
  File "/anaconda3/lib/python3.7/asyncio/sslproto.py", line 207, in feed_ssldata
    self._sslobj.unwrap()
  File "/anaconda3/lib/python3.7/ssl.py", line 767, in unwrap
    return self._sslobj.shutdown()
ssl.SSLError: [SSL: KRB5_S_INIT] application data after close notify (_ssl.c:2592)

When I use a persistent session to make concurrent requests, it sometimes hangs and completes after a very long time, and sometimes it fails with the following traceback. I am not sure why asyncio.TimeoutError is being raised.

Traceback (most recent call last):
  File "concurrent_test.py", line 406, in <module>
    asyncio.run(main(), debug=True)        
  File "/anaconda3/lib/python3.7/asyncio/runners.py", line 43, in run
    return loop.run_until_complete(main)
  File "/anaconda3/lib/python3.7/asyncio/base_events.py", line 573, in run_until_complete
    return future.result()
  File "concurrent_test.py", line 386, in main
    await asyncio.gather(*tasks)
  File "concurrent_test.py", line 248, in fetch
    async with session.post(url, **post_data) as response:
  File "/anaconda3/lib/python3.7/site-packages/aiohttp/client.py", line 1005, in __aenter__
    self._resp = await self._coro
  File "/anaconda3/lib/python3.7/site-packages/aiohttp/client.py", line 575, in _request
    break
  File "/anaconda3/lib/python3.7/site-packages/aiohttp/helpers.py", line 585, in __exit__
    raise asyncio.TimeoutError from None
concurrent.futures._base.TimeoutError
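Worth noting here: aiohttp applies a default total timeout of 5 minutes (ClientTimeout(total=300)) to every request, which matches the "hangs for a long time, then TimeoutError" pattern above. A minimal sketch of making the timeout explicit; the values are illustrative, not a fix:

```python
import aiohttp
import asyncio

# aiohttp's default is ClientTimeout(total=300), i.e. 5 minutes.
# An explicit, shorter timeout surfaces stalled requests sooner.
TIMEOUT = aiohttp.ClientTimeout(total=60, connect=10)

async def fetch(url):
    async with aiohttp.ClientSession(timeout=TIMEOUT) as session:
        async with session.get(url) as response:
            return response.status

if __name__ == "__main__":
    print(asyncio.run(fetch("https://httpstat.us/200")))
```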

@socketpair looks like this SSL error is a bug that affects aiohttp/asyncio on Python 3.7. No fix yet.

More info: #3535

Observing the same hang issue.

I have the same problem, even when opening a new ClientSession:

ssl.SSLError: [SSL: KRB5_S_INIT] application data after close notify (_ssl.c:2592)

If needed, I can provide more information.

Observing the same issue

Having the same issue.

Same issue.

Using one persistent session (ssl=False) for the entire script: aiohttp makes 10 concurrent connections, then hangs, then makes a random number of concurrent requests. Notice the timing:


[Screenshot_20200528_215117: request timing with a single persistent session]


26 requests complete in about 5 seconds. Now I do what @raees-khan suggested, creating a new session for every request:

[Screenshot_20200528_220411: request timing with a new session per request]

The script finishes in less than a second (4.60 s vs. 0.60 s).

The main problem with creating a new ClientSession for every GET request (as suggested by @raees-khan) is that you lose the ability to cap the number of simultaneous connections aiohttp opens.
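One way to keep a cap on concurrency even with a new session per request is a plain asyncio.Semaphore. A minimal sketch, with a stand-in coroutine in place of the actual aiohttp call:

```python
import asyncio

MAX_CONCURRENT = 16  # same cap that TCPConnector(limit=16) would provide

async def fetch(sem, i):
    # At most MAX_CONCURRENT coroutines run this body at once.
    async with sem:
        # Stand-in for: async with aiohttp.ClientSession() as s: await s.get(...)
        await asyncio.sleep(0.01)
        return i

async def main(n):
    sem = asyncio.Semaphore(MAX_CONCURRENT)
    return await asyncio.gather(*(fetch(sem, i) for i in range(n)))

results = asyncio.run(main(100))
print(len(results))  # 100 (gather preserves order)
```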

In my case (Python 3.8.2), a single GET request to https://diego.assencio.com already caused aiohttp to get stuck (HTTPS requests to other URLs worked fine, though). By simply creating a ClientSession object using the default parameters, the requests started succeeding, but a lot slower than usual. To clarify, I changed this:

    async with ClientSession(connector=TCPConnector(limit=16)) as session:
        ...

into this:

    async with ClientSession() as session:
        ...    

and the result was "requests succeeding but slowly", compared to "not a single request succeeding".

Interestingly, the idea from @raees-khan produced the exact same outcome for me: requests started succeeding, but just as slowly as with a single instance of ClientSession created using the default parameters.

For the record, HTTP requests are working fine. All issues I observed happen with HTTPS only.
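If the problem really is confined to the TLS layer, one quick diagnostic is to disable certificate handling for a single request with ssl=False, which session.get accepts. This is for debugging only, never for production; the URL is the one from the comment above:

```python
import aiohttp
import asyncio

async def check(url):
    async with aiohttp.ClientSession() as session:
        # Diagnostic only: ssl=False skips certificate verification, which
        # helps confirm whether the stall is in the TLS layer. Do not ship this.
        async with session.get(url, ssl=False) as response:
            return response.status

if __name__ == "__main__":
    print(asyncio.run(check("https://diego.assencio.com")))
```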
