pip doesn't respect proxy while installing packages from requirements file

Created on 8 May 2014 · 86 comments · Source: pypa/pip

I'm using pip 1.5.5 on Python 2.7.6, CentOS 6.5 x86_64. Python is compiled and installed in /usr/local/. While installing packages from a requirements file, the first 12 packages are downloaded via the proxy, but after that pip tries to connect directly to the server instead of going through the proxy, and the request fails.

$ export https_proxy=http://proxy:8080
$ /usr/local/bin/pip2.7 install -r reqs.txt

This results in the first 12 packages being downloaded via the proxy, but the next one is a direct HTTPS request to the PyPI server. I've verified this via a packet trace and the proxy logs.

Note that the installation works flawlessly using /usr/bin/pip, v1.3.1.

Thoughts?

Labels: proxy, auto-locked

Most helpful comment

I run
pip install -r requirements.txt --proxy=<proxy address>

My requirements.txt contains 20 entries; it installs the first 16 without any problems and then throws this error:
Timeout: (<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x1f70650>, 'Connection to pypi.python.org timed out. (connect timeout=15)')

Interestingly enough, it always fails when installing the 16th package (django-session-csrf==0.5), but if I install that package manually using pip (not via the requirements.txt file) it works fine. If I run pip install -r requirements.txt --proxy=<proxy address> again after that, it obviously skips django-session-csrf, but then installs the following package without problems and fails with the Timeout error on the package after that.

Increasing the timeout to something like 300 doesn't help.

All 86 comments

Is the one that is a direct HTTPS request to PyPI in a setup_requires somewhere?

No, I don't think so. I checked the /tmp/pip_build_root directory and grep'd for 'setup_requires' in all setup.py/cfg files. Nothing. The installation failed as expected.

Let me know if there is a different way to verify it.

Is it a particular package that is reaching out to PyPI?

Here are the contents of reqs.pip

requests==1.2.3
Flask==0.10.1
Flask-Login==0.2.7
Flask-Security==1.6.9
Flask-SQLAlchemy==1.0
Flask-Script==0.6.7
Flask-Principal==0.4.0
Flask-WTF==0.9.1
boto==2.3.0
suds==0.4
xmltodict==0.8.6
simplejson==3.3.3 
stripe==1.12.2
raven==4.1.1
Flask-Migrate==1.2.0
alembic==0.6.3

Installation fails at stripe. If I comment out simplejson, it fails at raven. Here is the failure log. Note that the timeout is due to the firewall dropping packets sent directly to the PyPI server.

Downloading/unpacking stripe==1.12.2 (from -r /var/www/html/backend/reqs.pip (line 21))
Cleaning up...
Exception:
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/pip-1.5.5-py2.7.egg/pip/basecommand.py", line 122, in main
    status = self.run(options, args)
  File "/usr/local/lib/python2.7/site-packages/pip-1.5.5-py2.7.egg/pip/commands/install.py", line 278, in run
    requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle)
  File "/usr/local/lib/python2.7/site-packages/pip-1.5.5-py2.7.egg/pip/req.py", line 1197, in prepare_files
    do_download,
  File "/usr/local/lib/python2.7/site-packages/pip-1.5.5-py2.7.egg/pip/req.py", line 1375, in unpack_url
    self.session,
  File "/usr/local/lib/python2.7/site-packages/pip-1.5.5-py2.7.egg/pip/download.py", line 546, in unpack_http_url
    resp = session.get(target_url, stream=True)
  File "/usr/local/lib/python2.7/site-packages/pip-1.5.5-py2.7.egg/pip/_vendor/requests/sessions.py", line 395, in get
    return self.request('GET', url, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/pip-1.5.5-py2.7.egg/pip/download.py", line 237, in request
    return super(PipSession, self).request(method, url, *args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/pip-1.5.5-py2.7.egg/pip/_vendor/requests/sessions.py", line 383, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python2.7/site-packages/pip-1.5.5-py2.7.egg/pip/_vendor/requests/sessions.py", line 486, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/pip-1.5.5-py2.7.egg/pip/_vendor/requests/adapters.py", line 387, in send
    raise Timeout(e)
Timeout: (<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x1f70650>, 'Connection to pypi.python.org timed out. (connect timeout=15)')

I'm having the same error. Strangely enough, it installs a bunch of packages from the requirements file without problems, but then fails with the VerifiedHTTPSConnection Timeout error.

I couldn't find a solution for this. As a workaround, I used proxychains to enforce transparent proxy.
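
For reference, a typical proxychains invocation looks like the following, assuming proxychains is installed and its config file points at your proxy:

$ proxychains /usr/local/bin/pip2.7 install -r reqs.txt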

It would be awesome if we could get a minimal set of steps to reproduce this.

I run
pip install -r requirements.txt --proxy=<proxy address>

My requirements.txt contains 20 entries; it installs the first 16 without any problems and then throws this error:
Timeout: (<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x1f70650>, 'Connection to pypi.python.org timed out. (connect timeout=15)')

Interestingly enough, it always fails when installing the 16th package (django-session-csrf==0.5), but if I install that package manually using pip (not via the requirements.txt file) it works fine. If I run pip install -r requirements.txt --proxy=<proxy address> again after that, it obviously skips django-session-csrf, but then installs the following package without problems and fails with the Timeout error on the package after that.

Increasing the timeout to something like 300 doesn't help.

I'm facing the same issue on CentOS 6.4 with Python 2.6.6 and pip 1.5.6. Any idea how to work around it?

Seems like it is always the 15th item in the requirements file. If I comment out earlier items, it times out on a later item than previously...

Tried with pip 1.5.4 and 1.5.6 on Ubuntu 14.04 LTS

@pcraston could it be some sort of rate limiting on your proxy? With the requirements file pip is sending too many requests in a short period of time?

Just spoke to our IT manager; he says there is no rate limiting on our proxy. But it does seem odd!

It doesn't seem to be enforced by the proxy. If you monitor the outgoing traffic, you'll see that at some point pip starts sending direct requests to the web server. Hence the requests either time out or are refused (depending on the network setup).
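
One way to watch for those direct requests, as a rough sketch - any packets addressed to the PyPI host rather than the proxy indicate pip has bypassed it:

$ tcpdump -n host pypi.python.org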

I can confirm I have the same issue. Installing threatstream/mhn with the requirements file.

This appears to be a requests bug, I've redirected this to upstream here: https://github.com/kennethreitz/requests/issues/2110

Or perhaps urllib3.

Ok requests has kicked this back to us, it appears it might actually be a PyPI/Fastly issue.

Can folks getting this issue run this script with requests 2.3.0?

import requests
session = requests.Session()
for i in range(100):
    print(i)
    session.get("https://pypi.python.org/simple/").content

and report back if it fails and where it fails?

Also if you can report back with the name of your proxy software and version or any other information of the like that we can use to try and reproduce.

Also if you can go to http://debug.fastly.com/ and post the encrypted block at the top so I can forward it to Fastly that would be great as well.

Oh, if the above script fails, can you also try it with https://imgur.com/ and https://google.com/

Also, can you clarify that there is no _address_-based restriction or rate limiting or the like? For example, the IP addresses associated with PyPI are also associated with a lot of other sites that Fastly hosts, like imgur, so could the proxies be denying based on the IP addresses?

So I ran the little script, and after 20 seconds of displaying content I get the following error:

51
Traceback (most recent call last):
  File "<stdin>", line 3, in <module>
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 468, in get
    return self.request('GET', url, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 456, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 559, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 378, in send
    raise ProxyError(e)
requests.exceptions.ProxyError: ('Cannot connect to proxy.', error(110, 'Connection timed out'))

Our corporate proxy doesn't have rate limiting or address based restrictions.

Just like above, the script failed on the 52nd request for both pypi and imgur. Google worked fine.

49
50
51
Traceback (most recent call last):
  File "/tmp/pypitest.py", line 5, in <module>
    session.get("https://pypi.python.org/simple/").content
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 468, in get
    return self.request('GET', url, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 456, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 559, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 378, in send
    raise ProxyError(e)
requests.exceptions.ProxyError: ('Cannot connect to proxy.', error(111, 'Connection refused'))

The proxy is configured to refuse when a direct connection is attempted, hence the 111. There is no rate limiting AFAIK. Furthermore, I can confirm that I've hit the same situation on two different proxy deployments - both Squid 3.1.10 on CentOS.

Response from debug.fastly.com

H4sIAAAAAAAAA31SXU/bMBT9K1GeNlan+WxTKqRloYVsNFSkICZVQm7iJhaJ3dkOTUH8912HaRVi
muQH+55zj6+Pz4tZEVwQIc3TF/OSS2WemgXZtKW1xVLVByvnjTkwbyURKCoJ0/iCP9O6xuthYNnG
p3vHmRq3m5apdmpcUdZ2RheOHkb+1BBPp+7Esj8bFyR/5Ouhazs2LMeYU0G2vIMKwCAf5TnZaWlF
OrUeVqqpB3i3q2mOFeVsPex06Uv3odzU019ntjUZnKyHJ/02/CuHrjArW1wS0CUM3WYDwnpKAJQY
5xVBMWdK8BoIDe4QUM/cYOLax5HQjOW8oKwESvlMdwDMe19QXFNwAyVLQBzfcryJFbrWCAj3aEUb
IqCeOb7tjcd+6DmWB8JOMBkHg7vMPspcEJ4sYQ6wTxxiXuhhk/SIZ0Q8gfVv19ie5fqB5bqu5YT+
kbQizQ7dz+f/GOUOC0ZlpZHQD7xx6HhBD8y52GNRkELvPjT+Ee4l894q+TRyUfY99szXgbmvOJUP
jKg9F486ORT2rG3e6dgGMo4nzwFZhhv9wHQxS+IVSpJkhS5/ngMgFVYtZNCMsiy5SGfnRnqdouX1
zSr6djXTnVzR7UF/1GYnsMTFV0qpsnBuUQZwA1+x0fAiStIVuskWUZqg9EeKejM1XPN9/ylvjHeQ
4K0i8r/dVOh4JjfQ+WH4vIKgkQJwCLiPbBe52mPJW5Hr50bLNImPtvE904O8mFyUmNHnPstanCki
WH/CtZEwqaiCuQy+hcOWi6aHjBXJK8ZrXh7gDlwUgkj5FprX19+sUp+UzgMAAA==

It also fails on 52nd request for me on https://pypi.python.org/simple/ and https://imgur.com/ (but with a different error than @shredder12 and @nullprobe -- see below):

50
51

ProxyError                                Traceback (most recent call last)
<ipython-input-21-788947201a19> in <module>()
      1 for i in range(100):
      2     print (i)
----> 3     session.get("https://pypi.python.org/simple/").content
      4 

/home/vagrant/venv/local/lib/python2.7/site-packages/requests/sessions.pyc in get(self, url, **kwargs)
    393 
    394         kwargs.setdefault('allow_redirects', True)
--> 395         return self.request('GET', url, **kwargs)
    396 
    397     def options(self, url, **kwargs):

/home/vagrant/venv/local/lib/python2.7/site-packages/requests/sessions.pyc in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert)
    381             'allow_redirects': allow_redirects,
    382         }
--> 383         resp = self.send(prep, **send_kwargs)
    384 
    385         return resp

/home/vagrant/venv/local/lib/python2.7/site-packages/requests/sessions.pyc in send(self, request, **kwargs)
    484         start = datetime.utcnow()
    485         # Send the request
--> 486         r = adapter.send(request, **kwargs)
    487         # Total elapsed time of the request (approximately)
    488         r.elapsed = datetime.utcnow() - start

/home/vagrant/venv/local/lib/python2.7/site-packages/requests/adapters.pyc in send(self, request, stream, timeout, verify, cert, proxies)
    379 
    380         except _ProxyError as e:
--> 381             raise ProxyError(e)
    382 
    383         except (_SSLError, _HTTPError) as e:

ProxyError: Cannot connect to proxy. Socket error: [Errno 110] Connection timed out.

All works fine when sending requests to https://google.com

Proxy is squid version 3.1.20 running on Debian

http://debug.fastly.com/ output:

H4sIAAAAAAAAA21SXW/aMBT9K1b20nY4xIFAaNUHlgKL1AZE0q8JaXIdN1hLbOo4BVr1v+863cam
VcqDc8+5536dV2fNac517Zy+Ol9VbZxTJ+cPTeE+0tqUe5epyuk41zXXeFxwafEr9SLKkq66geuh
o1shc7WtUZKhgUvO0O38dtA/Q/r5tOe53jGacfZDrbq+Rzz4CJoKzR/VbtW1MEiPGeMbK2v4zqy6
a1OVHbrZlIJRI5RcdXc29Hn3X7gqz57OPXfUOVl1T9pn+EcOX1JZNLTgoMslLh46XLaUACg3gkKY
uASVitFyDVOjo/qpETk05RLX946BFVG25jhS0mhVAr+iOwyC534w8r1D43gimcqFLIBSvIgNANN2
czgqBewLxwtAQt8dQsEQNjQYAuUOZ6LiGpCU9L0+6ZF+ELohGfaG4WgUdm5S7yA04ypeQCcNtLKP
VG6Hmn054CnXz3CethAJA7dnK8EXHCgZrzb4bjr9sJUbqqWo1zabDEkAd/IGLTBVekt1znP7+iD1
l3gry9p1lWxPiB/gy+jeees427US9XfJzVbpH9ZhAt6yqf7V8kOE0V//IwLaklZ2zm+TpHXOkPTw
xXiJo3mSTe6yOJnOl1fjLJ4n6SS6XsbZPeSw9xX9Xk9tqGnA2M44TeNZMrlAizGEKzjKw/5dGyfz
CF8lmWWrRjNbchkvJugT+LQ0XPP8MIfaSnuyV0fpgkrx0hoREqxHwLsolo9KV20UpZw1Wpg9ujSg
8PYTBBarvWcDAAA=

Fails on the 52nd attempt. As above, works fine with https://google.com.

Running Ubuntu 12.04 in host-only mode on VirtualBox, proxying through WinGate on a Windows 8.1 host.

Traceback (most recent call last):
  File "/tmp/test.py", line 5, in <module>
    session.get("https://pypi.python.org/simple/").content
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 468, in get
    return self.request('GET', url, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 456, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 559, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 378, in send
    raise ProxyError(e)
requests.exceptions.ProxyError: ('Cannot connect to proxy.', error(101, 'Network is unreachable'))

H4sIAAAAAAAAA21T70/bMBD9V6x8AhanSZ2kBMSHLsCK+FXRQpHWaXKTI7Wa2Jnjri2o//vOKaOT
hqxI8Xu+d8935zdnDjwH3Tgnb85ANcY5cXKYLQvvhTem3HiZqhzX6WcZ1JYzsDbTztxUpcvruhQZ
N0LJaWdtoS/r/+CqPP115nuJKypewLSzglntHk07Ry18jNKPDWjaL0Ba+Vv1KsqSTzuR55ODiZC5
WjXkbkxij52Syf0kDg9JHzPABGbXAq1ErOexmBxcD8a3Ny4pxQLIN8gW6pCkc60qzMlQzAuSIPaC
iJERf+Fa/A38uBu94bJYokd0AZI+jlyQHx5Tns2BpkoarUo8UPE1bQwv4cxH9pl+LZeQKm7ok+A2
Pghj1gNgAc9mLMn3SS5kpnIhCzxUvIoaicu2zDQtBVaAXg2RCSKv6zOvy5jXi1v9sahAIzMKQj8M
wpDFzEvCOOyGSRy4TyN/L/QN1NUQrS7R6yZVub3P42jPj0D/xoK3ibo2SeSFWJ3jcH9kDFVNny8v
P7XyxLUUzRy5417SQydR1OKXSq+4ziG3f59Evmu3qllbTqXzoMe69P7h3Nm6zmquRPNTglkpvbDT
KPBfLqudlm8XocTKRh8f6kpe2SsOhvTqbnzxcHcxRnDOZV5aGLc0iKjfrgAZ7JpZ4rA750JDZki/
aUQhKzt9rpNp4AZymzFJQur3qG9jMpQrWrjrBwgz6kd7x2olbW/eHKULLsVrO/nWEaxKMIYOebbA
wpBUVTWXm3/dDYa443muoWnfX2M0AD6D7w7zfZ8MuFTYKzLawT/QiTAbjBvyUpF+adT7haxW2sdN
ne0anoTMtw3NdmOwG4HtdvsHvEQyo+4DAAA=

Hi,
I've encountered the same problem in my Vagrant environment, running CentOS 6.5 and pip 1.5.6 with our corporate proxy.

It seems to happen randomly on big requirements.txt files that I try to install with pip when using the https proxy.

After investigation, I found that the problem seems to lie in the PoolManager of urllib3 that is embedded in the pip installation (_vendor/requests/packages/urllib3/poolmanager.py:ProxyManager)

I found a quick fix in the code that makes my installation pass correctly.

To confirm what I'm saying, just change this single line in the following file of your pip installation:

$ diff /opt/venvs/ironic/lib/python2.6/site-packages/pip/_vendor/requests/adapters.patch.py /opt/venvs/ironic/lib/python2.6/site-packages/pip/_vendor/requests/adapters.py
209c209
<             if True or not proxy in self.proxy_manager:
---
>             if not proxy in self.proxy_manager:

The pip install works correctly afterward when instantiating a new ProxyManager object every time.

Here is an example of error that I used to have for my pip install:

$ pip install -r requirements.txt
....
Downloading/unpacking pecan>=0.4.5 (from -r requirements.txt (line 25))
Cleaning up...
Exception:
Traceback (most recent call last):
  File "/opt/venvs/ironic/lib/python2.6/site-packages/pip/basecommand.py", line 122, in main
    status = self.run(options, args)
  File "/opt/venvs/ironic/lib/python2.6/site-packages/pip/commands/install.py", line 278, in run
    requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle)
  File "/opt/venvs/ironic/lib/python2.6/site-packages/pip/req.py", line 1197, in prepare_files
    do_download,
  File "/opt/venvs/ironic/lib/python2.6/site-packages/pip/req.py", line 1375, in unpack_url
    self.session,
  File "/opt/venvs/ironic/lib/python2.6/site-packages/pip/download.py", line 546, in unpack_http_url
    resp = session.get(target_url, stream=True)
  File "/opt/venvs/ironic/lib/python2.6/site-packages/pip/_vendor/requests/sessions.py", line 468, in get
    return self.request('GET', url, **kwargs)
  File "/opt/venvs/ironic/lib/python2.6/site-packages/pip/download.py", line 237, in request
    return super(PipSession, self).request(method, url, *args, **kwargs)
  File "/opt/venvs/ironic/lib/python2.6/site-packages/pip/_vendor/requests/sessions.py", line 456, in request
    resp = self.send(prep, **send_kwargs)
  File "/opt/venvs/ironic/lib/python2.6/site-packages/pip/_vendor/requests/sessions.py", line 559, in send
    r = adapter.send(request, **kwargs)
  File "/opt/venvs/ironic/lib/python2.6/site-packages/pip/_vendor/requests/adapters.py", line 384, in send
    raise Timeout(e, request=request)
Timeout: (<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x2802e50>, 'Connection to pypi.python.org timed out. (connect timeout=15)')

Storing debug log for failure in /root/.pip/pip.log

For the curious, the requirements.txt file I was testing is this one:
https://github.com/openstack/ironic/blob/master/requirements.txt

Hey @alexandrem, I'm not sure I'm grokking what you've actually changed. Was it in pip/_vendor/requests/adapters.py? Or a different file?

Yes, my test change was in pip/_vendor/requests/adapters.py.

Sorry, the diff was reversed; basically I just added a True to the condition on line 209 of adapters.py so that a ProxyManager instance is always created, thus skipping the pool-manager caching logic.
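
For anyone who would rather not patch vendored files, here is a hedged sketch of the same idea: clear the adapter's ProxyManager cache before each request so a fresh, correctly tunneled pool is built every time. It relies on HTTPAdapter.proxy_manager being a plain dict in requests ~2.x and is illustrative only, not what pip actually ships.

import requests

class FreshProxySession(requests.Session):
    def send(self, request, **kwargs):
        # Drop cached ProxyManager instances so a new one is created
        # for this request, mirroring the "True or" diff above.
        for adapter in self.adapters.values():
            adapter.proxy_manager.clear()
        return super(FreshProxySession, self).send(request, **kwargs)

session = FreshProxySession()
session.proxies = {'https': 'http://proxy:8080'}  # placeholder proxy URL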

If everyone who has run @dstufft's script above can try out the following script, it would help us narrow down the problem. First, determine which proxy requests is utilizing and substitute it below.

from requests.packages.urllib3 import ProxyManager

proxy_url = 'http://localhost:8080'  # Just an example, place your proxy URL here please

pool = ProxyManager(proxy_url)
for i in range(100):
    print(i)
    pool.request("GET", "https://pypi.python.org/simple/", headers={'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*'}).read()

This is roughly the equivalent to what requests sends.

Also, if all of you could make a simple get (with requests) to https://httpbin.org/get and post the result of calling the json method on the response, I'd be interested to see if the request once through the proxy differs from what we might expect to see in anyway.
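
That request is just a plain GET through the proxy; something like this sketch, with your proxy URL substituted:

import requests

proxies = {'https': 'http://proxy:8080'}  # placeholder, use your proxy URL
resp = requests.get('https://httpbin.org/get', proxies=proxies)
print(resp.json())  # shows the headers and origin that httpbin saw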

I'm encountering this issue with pip 1.5.5 or greater, even with requests 2.2.1. Pinning the pip version to 1.5.4 resolves the issue and we're able to install everything fine. Running @dstufft's script above, pypi fails, as does imgur, both on the 52nd attempt; google, however, resolves without a problem.

Here's the error I get when I kill the pypi and imgur runs:

Traceback (most recent call last):
  File "pip_test.py", line 5, in <module>
    session.get("https://imgur.com/").content
  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 395, in get
    return self.request('GET', url, **kwargs)
  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 383, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 486, in send
    r = adapter.send(request, **kwargs)
  File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 330, in send
    timeout=timeout
  File "/usr/lib/python2.7/dist-packages/urllib3/connectionpool.py", line 542, in urlopen
    body=body, headers=headers)
  File "/usr/lib/python2.7/dist-packages/urllib3/connectionpool.py", line 367, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/usr/lib/python2.7/httplib.py", line 973, in request
    self._send_request(method, url, body, headers)
  File "/usr/lib/python2.7/httplib.py", line 1007, in _send_request
    self.endheaders(body)
  File "/usr/lib/python2.7/httplib.py", line 969, in endheaders
    self._send_output(message_body)
  File "/usr/lib/python2.7/httplib.py", line 829, in _send_output
    self.send(msg)
  File "/usr/lib/python2.7/httplib.py", line 791, in send
    self.connect()
  File "/usr/lib/python2.7/dist-packages/urllib3/connectionpool.py", line 111, in connect
    timeout=self.timeout)
  File "/usr/lib/python2.7/socket.py", line 562, in create_connection
    sock.connect(sa)
  File "/usr/lib/python2.7/socket.py", line 224, in meth
    return getattr(self._sock,name)(*args)

Here's the fastly debug:

H4sIAAAAAAAAA21Ta2/bIBT9K4hPW2cc/M5D+xDl0WXqsmpJ2kpLVRGb2qg2ZJg0aaP8912cdVnX
CluCey4H7rmHPS44y7iucXePv6ja4C7O+GqTu/esNuWTm6oKO3hRc036OZcW/6aeRVmyZStyKfpw
43k9tFhtpNn00IWQmx3ateO7OOwh/dgNqEs/onOePqhly6cehc9DY6H5vdotWxYG+n6a8rWlNnxn
lq3CVKXD1utSpMwIJZetnQ192r0JV2Xv12fqdpyzZeusmbb/0pELJvMNyznwckkWM4fLJiWClOF0
DmHvlDySqcqEzCGaP4s1AONGADIoBZRNJpeABG3Xo6Hr+ZHrQ8YNmYuKawBmXkjDIPSDMHCTII7D
IIoC52pGTzznXE0uyUCBUPppoDJ7rcXshM+4fgSRm3O8Tsf1EzdJXK8dnlLmvFqTm/H4vZtcMS1F
XQDkh7ST+EmS0AYYK71lOuOZnb3d+Ye6IU1ZWnBSP8Y+mX0dBPjg4G2hRH0nudkq/YC7P/dYwEJu
qiMTtQMRBFM/il5+oJWssgUOvp+PpnPSh0jBZFba2HQ0J0Gb0GbYDtSGmQ0YEA/BFqlB/bJUxw4D
mPE6hXPxD87qWuSygmYgIe+VrpoUBDNkClGjFex6QCmTaMUhupEZYgYYdFMDGNnaN1XW0SgMfO99
CN86ONWcGZ4dG+ERGhIvhuwUSsibMBg5JDQitA0a7V+VFhI/jtvJEIw+OunwFngl40tDqH0R/5/k
wzGExvhw+9IOtZXWdXusdM6keD5q1cWXs8mUGwdNZOr+qzjEYcmyTPO6eei10ZwbKyu8xggFXm3Q
zKDpdVO9ME+w6ZrVBbwH03TBtsgyDQewWKdH9/qU0sTe9ujpo58Ph8NvDZg8aVQEAAA=

We're using squid for the proxy with connect tunnelling on Ubuntu 12.04 but I don't have the version handy.

@gravyboat Can you run the script that @sigmavirus24 posted too please?

@dstufft I don't have urllib3; sorry, I just edited my above post. Let me see about installing it.

@dstufft sorry, should have been more clear, it's specifically packages.urllib3 that's problematic.

pip uses requests which uses urllib3 :) The requests script that people tried earlier shows that this isn't specific to pip, so the problem must lie within requests OR urllib3, so the urllib3 script will hopefully tell us if it's a urllib3 problem or a requests problem.

@dstufft The issue was a Debian (and therefore Ubuntu) issue apparently. Using the following script resolved it; you might want to have @sigmavirus24 edit his post as well:

from urllib3 import ProxyManager

proxy_url = 'http://localhost:8080'  # Just an example, place your proxy URL here please

pool = ProxyManager(proxy_url)
for i in range(100):
    print(i)
    pool.request("GET", "https://pypi.python.org/simple/", headers={'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*'}).read()

I received the following error:

Traceback (most recent call last):
  File "pip2_test.py", line 8, in <module>
    pool.request("GET", "https://pypi.python.org/simple/", headers={'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*'}).read()
  File "/usr/lib/python2.7/dist-packages/urllib3/response.py", line 193, in read
    e)
urllib3.exceptions.DecodeError: ('Received response with content-encoding: gzip, but failed to decode it.', error('Error -2 while decompressing: inconsistent stream state',))

Seems odd that urllib3 would have an issue specifically with Fastly.

edit:

The script works perfectly fine when hitting https://google.com/

@gravyboat that's concerning that the urllib3 script behaves differently from requests. Did you check that you had used the right proxy? There's no reason it should fail where requests wouldn't.

@sigmavirus24 Yes I used the same proxy for all tests. The only constant between the failed tests is Fastly.

I faced this issue, and it is because of communication between the proxy and pypi.python.org. The python-requests package used by pip kind of stops creating connections after 50+ sessions, hence you will face this issue. I just tried globally changing the PyPI mirror to one of the active mirrors and was able to build the venv successfully.


@ik-dev that doesn't help us in the slightest. Can you run the scripts that @dstufft and I have posted above to help us determine why this happens?

The issue is fixed in the latest urllib3, version 1.8.3. The requests team should include the latest dev version in their code (currently somewhere between 1.8.2 and 1.8.3). I will ask them.

I actually took the time to trace the problem and found the culprit. The issue was the way the urllib3 pool manager wraps the httplib connection and attaches the SSL tunnel information for the proxy environment:

https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/connectionpool.py#L664-L683

The pool creates a new HTTPSConnection object (actually one monkey-patched with VerifiedHTTPSConnection) and sets up the direct connection to the proxy host the first time. The connection is correctly reused from the pool for subsequent calls, and the socket has the correct information (a direct connection to the proxy host, with the HTTP CONNECT tunnel magic set by httplib).

However, when you eventually reuse the same connection for a large download, like the PyPI index, the socket gets closed in httplib's HTTPSConnection.getresponse(). The next call to HTTPSConnection.send() then causes an auto-reconnect, but we lose the context of the proxy setup that was done in the urllib3 pool manager, because httplib has no knowledge of it (and those methods were not hooked by urllib3's VerifiedHTTPSConnection).

In the latest version of urllib3, I see that they set auto-reconnect to false on the created connections, so when the socket gets closed, a new connection is added to the pool via _new_conn().

So basically, if you want to test this fix, just pip install urllib3 and then create a symlink to this newer version from pip/_vendor/requests/packages/urllib3
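
To make the failure mode concrete, here is a minimal sketch of the httplib behavior described above (Python 2 standard library; the scenario is illustrative, not pip's actual code):

import httplib  # http.client on Python 3

conn = httplib.HTTPSConnection('proxy', 8080)
conn.set_tunnel('pypi.python.org', 443)  # CONNECT tunnel via the proxy

# httplib's default auto_open = 1 silently re-opens a closed socket on the
# next send(); that reconnect happens inside httplib and bypasses the proxy
# setup urllib3 performed on the connection. The urllib3 1.8.3 fix amounts to:
conn.auto_open = 0  # a closed connection now raises instead of re-dialing,
                    # so the pool replaces it via _new_conn() with the full
                    # proxy setup reapplied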

@alexandrem Installed urllib3 1.8.3 and executed the script shared above, but the behavior was the same as mentioned by @sigmavirus24:

[root@khanibdev ~]# pip install urllib3
Downloading/unpacking urllib3
  Downloading urllib3-1.8.3.tar.gz (85kB): 85kB downloaded
  Running setup.py (path:/tmp/pip_build_root/urllib3/setup.py) egg_info for package urllib3
    no previously-included directories found matching '*.pyc'
Installing collected packages: urllib3
  Running setup.py install for urllib3
    no previously-included directories found matching '*.pyc'
Successfully installed urllib3
Cleaning up...
[root@khanibdev ~]# ./trial.py
0
Traceback (most recent call last):
  File "./trial.py", line 10, in <module>
    pool.request("GET", "https://pypi.python.org/simple/", headers={'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*'}).read()
  File "/usr/lib/python2.6/site-packages/urllib3/response.py", line 210, in read
    "failed to decode it." % content_encoding, e)
urllib3.exceptions.DecodeError: ('Received response with content-encoding: gzip, but failed to decode it.', error('Error -2 while decompressing: inconsistent stream state',))

@ik-dev
Try this script instead

from urllib3 import ProxyManager

proxy_url = 'https://your_proxy:3128'  # Just an example, place your proxy URL here please

pool = ProxyManager(proxy_url)
for i in range(100):
    print(i)
    res = pool.request("GET", "https://pypi.python.org/simple/").read()

When it works, you can try a new run of your 'pip install -r requirements.txt' with a symlink to this newer urllib3 inside your pip installation, to confirm that all is good.

I just ran @alexandrem's script above with the latest PyPI version of urllib3 and it worked fine.

I can also confirm that replacing the urllib3 version in pip's requests folder with a symlink to the latest urllib3 fixes the Timeout issue when pip installing from a large requirements.txt.

However, when I run the script posted by @sigmavirus24 I also get this error:
urllib3.exceptions.DecodeError: ('Received response with content-encoding: gzip, but failed to decode it.', error('Error -2 while decompressing: inconsistent stream state',))

I've forked pip and updated urllib3 to version 1.9 (the latest on PyPI), and it works for us when installing a package with many dependencies through a proxy. Feel free to try it:
https://github.com/ctxis/pip

Just to tag on: I had the same problem you are all having. I was able to fix it within pip by removing requests session pooling, forcing a new PipSession() with every HTTP GET. I was also able to fix it within _vendor/requests/adapters.py by changing DEFAULT_RETRIES from 0 to 10. @dgaus's forked pip also works flawlessly, and that's my interim solution until pip updates. Oh, how I wish I had found this thread sooner, after two full work days of wide-eyed debugging and tcpdumping and blaming our proxy. I hate everything right now, and you all know what I mean :)

I had the same experience as @pcraston: I ran @alexandrem's script with no problem (using urllib3 1.9.0, installed via pip), and @sigmavirus24's with the same error (using requests 2.3.0).

Same problem on an Ubuntu installation, trying to install "localshop", and the same problem on a Windows workstation. There, I was able to install requirement by requirement in the terminal.

On the "localshop" machine, the problem (same as described above) happened on the netaddr package. I then installed it alone, netaddr==0.7.10, plus several others, and was then able to finish my localshop installation.

Yep, it's a corporate firewall and I cannot change my environment.

If urllib3 indeed solves the issue, and we can get some confirmation of that, we can likely work with the requests team to get a patch release with it included, so we can do the same before 1.6.

I forked a snapshot of requests, and using that to replace it within a devpi virtualenv solved my problems with devpi.

…/bin/pip install -U "https://github.com/jhermann/requests/archive/proxy-fixed-for-devpi.zip#egg=requests"

I've been trying to make this work on a production test bed. I think ProxyManager doesn't work if your proxy requires a username/password.

Traceback (most recent call last):
  File "test.py", line 8, in <module>
    pool.request("GET", "https://pypi.python.org/simple/", headers={'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*'}).read()
  File "/appl/envprod/local/lib/python2.7/site-packages/urllib3/request.py", line 68, in request
    **urlopen_kw)
  File "/appl/envprod/local/lib/python2.7/site-packages/urllib3/request.py", line 81, in request_encode_url
    return self.urlopen(method, url, **urlopen_kw)
  File "/appl/envprod/local/lib/python2.7/site-packages/urllib3/poolmanager.py", line 261, in urlopen
    return super(ProxyManager, self).urlopen(method, url, redirect=redirect, **kw)
  File "/appl/envprod/local/lib/python2.7/site-packages/urllib3/poolmanager.py", line 153, in urlopen
    response = conn.urlopen(method, u.request_uri, **kw)
  File "/appl/envprod/local/lib/python2.7/site-packages/urllib3/connectionpool.py", line 579, in urlopen
    release_conn=release_conn, **response_kw)
  File "/appl/envprod/local/lib/python2.7/site-packages/urllib3/connectionpool.py", line 579, in urlopen
    release_conn=release_conn, **response_kw)
  File "/appl/envprod/local/lib/python2.7/site-packages/urllib3/connectionpool.py", line 579, in urlopen
    release_conn=release_conn, **response_kw)
  File "/appl/envprod/local/lib/python2.7/site-packages/urllib3/connectionpool.py", line 559, in urlopen
    _pool=self, _stacktrace=stacktrace)
  File "/appl/envprod/local/lib/python2.7/site-packages/urllib3/util/retry.py", line 265, in increment
    raise MaxRetryError(_pool, url, error)
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='pypi.python.org', port=443): Max retries exceeded with url: /simple/ (Caused by ProxyError('Cannot connect to proxy.', error('Tunnel connection failed: 407 authenticationrequired',)))
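
For what it's worth, in the urllib3 versions discussed here ProxyManager does not pull user:pass out of the proxy URL; credentials have to be passed explicitly as proxy headers. A sketch with placeholder credentials and proxy address:

from urllib3 import ProxyManager, make_headers

auth = make_headers(proxy_basic_auth='user:password')  # placeholder creds
pool = ProxyManager('http://proxy:3128', proxy_headers=auth)
print(pool.request('GET', 'https://pypi.python.org/simple/').status)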

What version of urllib3 is that?

Just wanted to mention I ran into this too, I think. I have pip 1.5.4 on the system; I created a virtualenv and tried to pip install -r req.pip going through a proxy (tinyproxy) and got:

Timeout: (, 'Connection to pypi.python.org timed out. (connect timeout=15)')

In the virtual environment I downgraded pip to 1.4.1 and it works now.

I'm running into the same issue when trying to deploy my app on Docker using the image "ubuntu:14.04" and "python3-pip", which is version 1.5.6. In my case, I don't know how to get pip 1.4 on my Ubuntu image. Any hint?

The best way would be to use your distro pip to install another pip version:

pip install -U pip==1.4


another +1 for getting the fix for this into a pip release as soon as possible

This affects me, too. Using a slow MITM SSL proxy results in timeouts all the time, regardless of the --default-timeout value.

I spent some time bisecting requests and urllib3, and the issue seems to be fixed upstream.

Specifically, the issue was fixed in shazow/urllib3@1c30a1f3a4af9591f480a338f75221bdf5ca48da, which was released in 1.8.3, which was vendored by requests in 2.4.0, which was in turn vendored by pip in 6.0.

Using an affected version of requests reliably fails to use the proxy after 24 GET requests to PyPI, and pip reliably fails to use the proxy after 12 packages have been installed. Upgrading them fixes the problem for me.

It is fixed in pip 1.6

Everyone should upgrade :)


Closing this since it's fixed in pip 6.x.

I'm seeing identical behavior to what everyone is reporting here in Pip 7.1 (installed by default with the installer for Python 2.7.10) on Windows 7. Is this the same issue? Was this fixed on Windows?

I also get this problem from within our company Windows network with different versions of pip.

Using pip 6.0.8 from C:\Python34\lib\site-packages (python 3.4) I get this output on the console:

C:\src\set>c:\Python34\Scripts\pip.exe install cx_Freeze
Collecting cx-Freeze
  Retrying (Retry(total=4, connect=None, read=None, redirect=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 407 Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy filter is denied.  )',))': /simple/cx-freeze/
  Retrying (Retry(total=3, connect=None, read=None, redirect=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 407 Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy filter is denied.  )',))': /simple/cx-freeze/
  Retrying (Retry(total=2, connect=None, read=None, redirect=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 407 Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy filter is denied.  )',))': /simple/cx-freeze/
  Retrying (Retry(total=1, connect=None, read=None, redirect=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 407 Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy filter is denied.  )',))': /simple/cx-freeze/
  Retrying (Retry(total=0, connect=None, read=None, redirect=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 407 Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy filter is denied.  )',))': /simple/cx-freeze/
  Retrying (Retry(total=4, connect=None, read=None, redirect=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 407 Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy filter is denied.  )',))': /simple/cx-freeze/
  Retrying (Retry(total=3, connect=None, read=None, redirect=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 407 Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy filter is denied.  )',))': /simple/cx-freeze/
  Retrying (Retry(total=2, connect=None, read=None, redirect=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 407 Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy filter is denied.  )',))': /simple/cx-freeze/
  Retrying (Retry(total=1, connect=None, read=None, redirect=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 407 Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy filter is denied.  )',))': /simple/cx-freeze/
  Retrying (Retry(total=0, connect=None, read=None, redirect=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 407 Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy filter is denied.  )',))': /simple/cx-freeze/
  Could not find any downloads that satisfy the requirement cx-Freeze
  No distributions at all found for cx-Freeze

Using pip 1.0.1 from c:\python27\lib\site-packages (python 2.7) I got this pip.log file content:

C:\Python27\Scripts\pip-script.py run on 08/11/15 16:36:48
Downloading/unpacking cx-Freeze
  Getting page http://pypi.python.org/simple/cx_Freeze
  Could not fetch URL http://pypi.python.org/simple/cx_Freeze: HTTP Error 407: Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy filter is denied.  )
  Will skip URL http://pypi.python.org/simple/cx_Freeze when looking for download links for cx-Freeze
  Getting page http://pypi.python.org/simple/
  Could not fetch URL http://pypi.python.org/simple/: HTTP Error 407: Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy filter is denied.  )
  Will skip URL http://pypi.python.org/simple/ when looking for download links for cx-Freeze
  Cannot fetch index base URL http://pypi.python.org/simple/
  URLs to search for versions for cx-Freeze:
  * http://pypi.python.org/simple/cx_Freeze/
  Getting page http://pypi.python.org/simple/cx_Freeze/
  Could not fetch URL http://pypi.python.org/simple/cx_Freeze/: HTTP Error 407: Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy filter is denied.  )
  Will skip URL http://pypi.python.org/simple/cx_Freeze/ when looking for download links for cx-Freeze
  Could not find any downloads that satisfy the requirement cx-Freeze
No distributions at all found for cx-Freeze
Exception information:
Traceback (most recent call last):
  File "C:\Python27\lib\site-packages\pip\basecommand.py", line 126, in main
    self.run(options, args)
  File "C:\Python27\lib\site-packages\pip\commands\install.py", line 223, in run
    requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle)
  File "C:\Python27\lib\site-packages\pip\req.py", line 954, in prepare_files
    url = finder.find_requirement(req_to_install, upgrade=self.upgrade)
  File "C:\Python27\lib\site-packages\pip\index.py", line 152, in find_requirement
    raise DistributionNotFound('No distributions at all found for %s' % req)
DistributionNotFound: No distributions at all found for cx-Freeze

This is still an issue in pip version 7.1.2.

I am seeing it for index URLs, though.

For example, if you try to download pandas from an index, pandas will require numpy, which pip will then try to get from PyPI instead of the specified index_url.

details are:

Python 2.7
Pip 7.1.2
Redhat Centos 6.6

@AndiEcker, have you found any solution for that problem? I face the same issue from my office. Your help is very much appreciated.

This is the error:

  Retrying (Retry(total=4, connect=None, read=None, redirect=None)) after connection broken by 'ConnectTimeoutError(<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x00000000033A19E8>, 'Connection to pypi.python.org timed out. (connect timeout=15)')': /simple/pydicom/

@iamnagesh: the solution in my case was that our network administrator gave me access to the internet without a proxy.

But your error message looks different anyway - sorry, I don't know why your connection is showing a timeout error. If you have a slow internet connection then maybe try to increase the timeout (unfortunately I don't know how to do this).
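
For reference, pip exposes the timeout as a flag - the same knob mentioned earlier in this thread - e.g.:

$ pip install --default-timeout=60 -r requirements.txt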

For the previous errors involving HTTP proxy authentication via NTLM, I would suggest using cntlm as a workaround.

Setting it up locally with an NTLM hash of your proxy password really solves a lot of different issues with apps and libraries that don't support corporate http/https proxies very well.

You just have to point your proxy env vars and app settings at http://localhost:3128
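
For example, in the style of the commands earlier in this thread:

$ export http_proxy=http://localhost:3128
$ export https_proxy=http://localhost:3128
$ pip install -r requirements.txt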

@iamnagesh I am facing the same error while installing a package through pip in the Anaconda prompt. I set up the HTTP and HTTPS proxy environment variables, but I still get the same error. Can you please tell me what you did to solve the issue?

@ashi-taka: in my case our network admins fixed it for me, and AFAIR the fix was that they totally removed any proxy in our company network.


Though the proxy configuration was added at the system level, for some reason _pip_ was giving the error _[Errno 101] Network is unreachable_.

Using the proxy option fixed the issue (thanks to @pcraston):

sudo pip install pandas --proxy=proxy:8080

Ubuntu 14.04
pip 8.1.2

Running pip with the --isolated flag solved the problem for me.
pip3 8.1.1

Got the same errors with pip 9.0.1 on Windows 10.
The problem was with the proxy: it was set as "" (empty, but flagged as in use) at the system-wide level.

[update] It worked! I've found the line, it's now 178. Thanks.


Hi all, can somebody help me? I've tried to install mkdocs and got this error:

C:\Users\Lenovo Local>pip install mkdocs
Collecting mkdocs
Retrying (Retry(total=4, connect=None, read=None, redirect=None)) after connection broken by 'ConnectTimeoutError(, 'Connection to pypi.python.org timed out. (connect timeout=15)')': /simple/mkdocs/
...
@alexandrem this solution doesn't work anymore because line 209 is empty - any other idea?? (Commented on Jul 2, 2014: "I just add a True to the condition on line 209 of the adapter.py to always create a ProxyManager instance, thus skipping the pool manager logic.")

Python 3.5.3
pip 9.0.1
Thanks.

Hi, I am not able to install lightgbm in Spyder. I get the following error and would appreciate any help on how to fix this. Yes, there is a proxy at work. Thanks!

Retrying (Retry(total=4, connect=None, read=None, redirect=None)) after connection broken by 'NewConnectionError(': Failed to establish a new connection: [Errno 11002] getaddrinfo failed',)': /simple/lightgbm/

Well, it seems you've got DNS trouble, which pip cannot do anything about. Also, try to avoid appending to closed tickets; it makes things hard to follow.

pip install -U requirement --proxy=my_proxy:port

it worked!!

Hi Senani, that did not work for me. Did I code it correctly? Let's say I need to download the implicit module.

pip install implicit --proxy=sm_proxy:xx28

Also, I am behind a PAC firewall. Each time I use the internet, I need to enter a login and password.

Thanks a ton.

So I had a similar issue to this where it was only failing on pip installs from a git repo, and only when I was behind a proxy. I was not able to get it resolved for git installs over http, so I ended up changing my install to use ssh, which worked for my environment.

Works behind proxy

pip install git+ssh://[email protected]/path/to/repo@tag#egg=package-name

Doesn't work behind proxy

pip install git+git://github.com/path/to/repo@tag#egg=package-name

The pip command below worked for me.

sudo pip --proxy http://proxy_username:proxy_password@proxyhost:port install

In my case the problem was with the PIP_INDEX_URL variable.
--isolated helped.

The --isolated flag worked for me when I'm not behind a proxy, but the "Cannot connect to proxy" error keeps coming up!

Just try the below:
pip install --proxy=user:pass@server:port <package Name>

for example
pip install --proxy=http://10.10.10.150/accelerated_pac_base.pac quandl

In my case both solutions work:

  • set proxy in pip pip install --proxy=user:pass@server:port <package Name>
  • or setting HTTP_PROXY & HTTPS_PROXY and then pip install <package Name> (see the example below)
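
Concretely, the second option looks like this (placeholder values):

$ export HTTP_PROXY=http://user:pass@server:port
$ export HTTPS_PROXY=http://user:pass@server:port
$ pip install <package Name>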

I already had http_proxy and https_proxy set, but was getting connection failures. Simply setting up HTTP_PROXY and HTTPS_PROXY worked for me.

I was able to resolve the issue by increasing resources on my proxy server. Initially, my proxy server was running 2 "servers" and allowed 5 "client" connections. After increasing to 10 servers and 20 client connections, pip installed packages perfectly.

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
