Conan: Issues with proxies

Created on 15 Jan 2018  ·  47 comments  ·  Source: conan-io/conan

I've started a Conan server on my organization's local network. A proxy must be used for external sites (like GitHub), but not for computers on the local network.
So I have these env variables:

HTTP_PROXY=http://proxy.domain:3128
HTTPS_PROXY=https://proxy.domain:3128
NO_PROXY=localhost,127.0.0.1,*.domain,192.168.*

With these variables I can create packages with recipes in my local cache (and these recipes can download external sources), but I cannot download recipes and prebuilt packages from the organization's Conan server.

I've tried to set the same settings in conan.conf, but had no luck. I've also tried using the Conan server's domain name and/or IP address in NO_PROXY, with the same result:

> conan search -r <remotename> "OpenCV*"
ERROR: Permission denied for user: '<username>'. [Remote: <remotename>]

But if I disable all proxies (comment them out in conan.conf and unset the env variables), it works just fine:

Existing package recipes:

OpenCV/3.3.1@lukyanets/testing

Conan v. 1.0.1, OS: Arch Linux, CentOS 6, Windows 10.

Feedback please!


All 47 comments

I don't know if it's the problem, but it seems that the curl syntax for NO_PROXY doesn't allow * wildcard symbols; it's used only for domain names. Could you try specifying NO_PROXY with the full IP of your server to confirm?
The domain syntax should be just .domain or domain
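For what it's worth, Python's standard library follows the same suffix-only convention: urllib treats NO_PROXY entries as domain suffixes and never expands a literal *. A minimal sketch (the hostname is illustrative):

```python
# NO_PROXY entries are matched as domain suffixes, not shell wildcards.
from urllib.request import proxy_bypass_environment

host = "server-name.domain"

# A literal "*" is never expanded, so this pattern matches nothing:
print(proxy_bypass_environment(host, {"no": "*.domain"}))  # False

# Plain suffixes work, with or without the leading dot:
print(proxy_bypass_environment(host, {"no": ".domain"}))   # True
print(proxy_bypass_environment(host, {"no": "domain"}))    # True
```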

@lasote , same result:

> export NO_PROXY="192.168.6.221"
> conan search -r <remotename> "OpenCV*"
ERROR: Permission denied for user: '<username>'. [Remote: <remotename>]

But the permission denied doesn't look like a proxy error, right? Am I missing something?

Right, it doesn't. But it's not a problem with the server configuration or the local configuration, because without any proxy settings (neither in conan.conf nor in the environment) everything works fine:

> conan search -r <remotename> "OpenCV*"
ERROR: Permission denied for user: '<username>'. [Remote: <remotename>]
> export http_proxy=""                     
> conan search -r <remotename> "OpenCV*"
Existing package recipes:

OpenCV/3.3.1@lukyanets/testing

I have no idea what could be happening. Could you just curl the remote URL and check whether the server is really responding at that URL? (Change the port if needed)

curl -vi http://192.168.6.221:9300/v1/conans/search?q=OpenCV*
curl -vi http://192.168.6.221:9300/v1/ping

If I open the links in a browser (Chrome/Firefox, same proxy settings), everything works fine. Using curl I get an access-denied page, because curl uses the proxy to connect to a server inside the organization's local network (our proxy behaves that way), and for connections like this we cannot use our proxy server.

  • For export no_proxy="*.domain" I get the same proxy error.
  • For export no_proxy="server-name.domain" I get the correct answer for http://server-name.domain:9300/v1/conans/search?q=OpenCV*, but for conan search -r <remotename> "OpenCV*" I get the same ERROR: Permission denied for user: '<username>'. [Remote: <remotename>].
  • For export no_proxy="server-name.domain,localhost,127.0.0.1,*.domain,192.168.*" the result is the same as for export no_proxy="server-name.domain".
  • For export no_proxy="192.168.*" I get the same proxy error.
  • For export no_proxy="192.168.6.221" the result is the same as for export no_proxy="server-name.domain".

It looks like curl parses the no_proxy variable incorrectly. But even when curl gets a correct response for curl -vi http://192.168.6.221:9300/v1/ping, conan search … doesn't work.

@lasote , any suggestions?

Sorry for not answering, I have no idea what can be happening and cannot reproduce it. :(

I can only suggest running Conan from sources, adding debug prints in the client/rest/rest_api.py module (the search method), and trying to see what it is doing

It's very strange that no one else has the same issues. I suppose the proxy in my company is pretty standard, which would mean the whole proxy-related machinery isn't working.
I've returned to this issue again (4 weeks later) but still cannot figure out what I'm doing wrong :(

Yes, it is weird. We are aware of users running conan in their organizations behind proxies, so we don't know what could be happening in your case.

But if I disable all proxies (comment them out in conan.conf and unset the env variables), it works just fine:

So the issue is that conan should also access resources outside the local network? A common case is that users use conan entirely within their local network. Which external resources fail? Could you please share the output of those failures?

It is a bit unfortunate; this kind of error is very difficult to debug because we cannot reproduce it, and it might also be related to something specific in the network. Do you have access to information about the proxy? Can you get the proxy config and logs?

@memsharded , yes, the problem is accessing external resources. It's critical, as we want to download sources (or binaries) for packages. A good example is Boost: it has a prebuilt version for Windows (which we want to just download) and sources for the other platforms.
I do not have access to the proxy configuration and logs, but here is what I know: connections to all internal servers (*.domain or internal IPs) must be made without the proxy, so if I try to reach the host our-conan-server-hostname.domain through the proxy, I get the error ERROR: Permission denied for user: '<username>'. [Remote: upload_repo].

I have a similar issue.

I am trying to download using conan config install from a local server behind the firewall.

If proxies are set in conan.conf it will always use them. It respects neither the no-proxy entry nor the NO_PROXY environment variable. If I remove the proxies from conan.conf, then conan config install works.

If conan.conf contains proxies, then no matter what no-proxy or NO_PROXY is set to, I get this:

$ conan config install http://bitbucket-idb:7990/projects/DEAL/repos/public/raw/conan/linux-config.zip
Trying to download  http://bitbucket-idb:7990/projects/DEAL/repos/public/raw/conan/linux-config.zip
[==================================================] 1.3KB/1.3KB
Traceback (most recent call last):
  File "/develop/anaconda2/lib/python2.7/site-packages/conans/client/command.py", line 1131, in run
    method(args[0][1:])
  File "/develop/anaconda2/lib/python2.7/site-packages/conans/client/command.py", line 332, in config
    return self._conan.config_install(args.item, verify_ssl)
  File "/develop/anaconda2/lib/python2.7/site-packages/conans/client/conan_api.py", line 64, in wrapper
    return f(*args, **kwargs)
  File "/develop/anaconda2/lib/python2.7/site-packages/conans/client/conan_api.py", line 444, in config_install
    return configuration_install(item, self._client_cache, self._user_io.out, self._runner, verify_ssl)
  File "/develop/anaconda2/lib/python2.7/site-packages/conans/client/conf/config_installer.py", line 119, in configuration_install
    _process_download(item, client_cache, output, tmp_folder, verify_ssl)
  File "/develop/anaconda2/lib/python2.7/site-packages/conans/client/conf/config_installer.py", line 97, in _process_download
    _process_zip_file(zippath, client_cache, output, tmp_folder, remove=True)
  File "/develop/anaconda2/lib/python2.7/site-packages/conans/client/conf/config_installer.py", line 51, in _process_zip_file
    unzip(zippath, tmp_folder)
  File "/develop/anaconda2/lib/python2.7/site-packages/conans/client/tools/files.py", line 78, in unzip
    with zipfile.ZipFile(filename, "r") as z:
  File "/develop/anaconda2/lib/python2.7/zipfile.py", line 770, in __init__
    self._RealGetContents()
  File "/develop/anaconda2/lib/python2.7/zipfile.py", line 813, in _RealGetContents
    raise BadZipfile, "File is not a zip file"
BadZipfile: File is not a zip file

ERROR: File is not a zip file

It says it's not a zip file because it receives an error page from the proxy and tries to unzip that.
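A defensive check along these lines would surface the real problem earlier; this is a hypothetical helper for illustration, not something conan ships:

```python
import zipfile

def ensure_zip(path):
    """Raise a clear error when a proxy error page was saved instead of a zip."""
    if not zipfile.is_zipfile(path):
        with open(path, "rb") as f:
            head = f.read(200)
        raise RuntimeError("Downloaded file is not a zip; first bytes: %r" % head)

# ensure_zip("linux-config.zip")  # would raise, showing the proxy's HTML error page
```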

I tried setting up a test script to debug this. It calls config_installer._process_download(), which is what conan config install does:

import sys
import os
from conans import paths
from conans.client import client_cache
from conans.client.conf import config_installer
from conans.client.output import ConanOutput


def main():
    item = "http://bitbucket-idb:7990/projects/DEAL/repos/public/raw/conan/linux-config.zip"

    output = ConanOutput(sys.stdout, True)
    user_home = paths.get_conan_user_home()
    cache = client_cache.ClientCache(user_home, None, output)
    tmp_folder = os.path.join(cache.conan_folder, "tmp_config_install")

    config_installer._process_download(item, cache, output, tmp_folder, False)


if __name__ == "__main__":
    main()

Interestingly this works! It downloads fine, no matter what proxies are set in conan.conf.

$ python test_unzip.py
Trying to download  http://bitbucket-idb:7990/projects/DEAL/repos/public/raw/conan/linux-config.zip
[==================================================] 3.8KB/3.8KB
Unzipping 1.3KB
Unzipping 100 %
Defining remotes
Installing profiles
    Installing profile default
    Installing profile win-vs8-release
    Installing profile win-vs15-debug
    Installing profile linux-gcc5-debug
    Installing profile win-vs15-release
    Installing profile msys-gcc6-debug
    Installing profile win-vs8-debug
    Installing profile linux-gcc5-release

I found the difference: when running my test script, when it reaches the line C:\Users\epederson\AppData\Local\Programs\Python\Python36\Lib\site-packages\conans\client\tools\net.py:55, _global_requester is a

<module 'requests' from '/develop/anaconda2/lib/python2.7/site-packages/requests/__init__.pyc'>

However, if I run conan config install then _global_requester at that same line is a

<requests.sessions.Session object at 0x7ff500c23e50>

My test script is probably working because it doesn't process conan.conf and set _global_requester.
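This matches how python-requests merges proxy settings: when proxies are set on a Session object (as conan does after reading conan.conf), merge_environment_settings merges the session-level proxies into every request even when NO_PROXY says the host should be bypassed. A sketch against the public requests API (host and proxy names are illustrative):

```python
import os
import requests

# Simulate conan's setup: a Session preloaded with proxies from conan.conf
session = requests.Session()
session.proxies.update({"http": "http://hfcproxy:8080"})

# Even with NO_PROXY naming the host...
os.environ["NO_PROXY"] = "bitbucket-idb"

# ...the environment proxy is dropped (the bypass is honored), but the
# session-level proxies are merged back in unconditionally:
settings = session.merge_environment_settings(
    "http://bitbucket-idb:7990/linux-config.zip", {}, None, None, None)
print(settings["proxies"])  # the session proxy survives the NO_PROXY bypass
```

Setting session.trust_env = False stops requests from consulting the environment at all, which is the workaround discussed later in this thread.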

Hello @leugenea @sourcedelica

I wanted to give an update on this issue. A big refactor of the HTTP layer has been done, and some issues in the underlying python-requests library regarding proxies have become more evident. Some extra logic has been added that seems to work much better, especially regarding the "no-proxy" configuration.

The changes are already in develop and will be released next week in 1.2, but it would be great if you could run from the develop sources and try it. Basically, you can now configure no-proxy patterns in your conan.conf with no_proxy_match:

[proxies]
# http = http://user:[email protected]:3128/
# http = http://10.10.1.10:3128
# https = http://10.10.1.10:1080
# You can skip the proxy for the matching (fnmatch) urls (comma-separated)
no_proxy_match = *bintray.com*, https://myserver.*

It would be great if you could run from sources, develop branch, and give it a try in your environment. Many thanks!
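The fnmatch semantics can be checked in isolation; this sketch mirrors how a URL would be tested against the comma-separated patterns (the URLs are illustrative):

```python
from fnmatch import fnmatch

# no_proxy_match patterns are fnmatch-style and applied to the full URL
raw = "*bintray.com*, https://myserver.*"
patterns = [p.strip() for p in raw.split(",")]

def bypass_proxy(url):
    """Return True if any no_proxy_match pattern matches the URL."""
    return any(fnmatch(url, p) for p in patterns)

print(bypass_proxy("https://api.bintray.com/conan/conan-center"))  # True
print(bypass_proxy("https://myserver.local/v1/ping"))              # True
print(bypass_proxy("http://github.com/conan-io/conan"))            # False
```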

Hi - I tried running from sources and I got the same thing:

(conan-dev)
epederson@dev-idb19 ~
$ conan-dev config install
WARN: Migration: Updating settings.yml
WARN: ****************************************
WARN: A new settings.yml has been defined
WARN: Your old settings.yml has been backup'd to: /userhome/epederson/.conan/settings.yml.backup
WARN: ****************************************
Trying to download  http://bitbucket-idb:7990/projects/DEAL/repos/public/raw/conan/linux-config.zip
[==================================================] 1.3KB/1.3KB
Traceback (most recent call last):
  File "/userhome/epederson/work/conan/conans/client/command.py", line 1159, in run
    method(args[0][1:])
  File "/userhome/epederson/work/conan/conans/client/command.py", line 338, in config
    return self._conan.config_install(args.item, verify_ssl)
  File "/userhome/epederson/work/conan/conans/client/conan_api.py", line 59, in wrapper
    return f(*args, **kwargs)
  File "/userhome/epederson/work/conan/conans/client/conan_api.py", line 457, in config_install
    return configuration_install(item, self._client_cache, self._user_io.out, verify_ssl)
  File "/userhome/epederson/work/conan/conans/client/conf/config_installer.py", line 126, in configuration_install
    _process_download(item, client_cache, output, tmp_folder, verify_ssl)
  File "/userhome/epederson/work/conan/conans/client/conf/config_installer.py", line 104, in _process_download
    _process_zip_file(zippath, client_cache, output, tmp_folder, remove=True)
  File "/userhome/epederson/work/conan/conans/client/conf/config_installer.py", line 58, in _process_zip_file
    unzip(zippath, tmp_folder)
  File "/userhome/epederson/work/conan/conans/client/tools/files.py", line 78, in unzip
    with zipfile.ZipFile(filename, "r") as z:
  File "/userhome/epederson/.conda/envs/conan-dev/lib/python3.6/zipfile.py", line 1108, in __init__
    self._RealGetContents()
  File "/userhome/epederson/.conda/envs/conan-dev/lib/python3.6/zipfile.py", line 1175, in _RealGetContents
    raise BadZipFile("File is not a zip file")
zipfile.BadZipFile: File is not a zip file

ERROR: File is not a zip file

Here is my conan.conf proxy section:

[proxies]
http = http://hfcproxy.tradeweb.com:8080
https = http://hfcproxy.tradeweb.com:8080
no_proxy_match = http://bitbucket-idb:7990*

Here is my launcher:

#!/usr/bin/env python
import sys
conan_sources_dir = '/userhome/epederson/work/conan'

sys.path.insert(1, conan_sources_dir)
# Or append to sys.path to prioritize a binary installation before the source code one
# sys.path.append(conan_sources_dir)

from conans.conan import main
main(sys.argv[1:])

Confirming that I'm on the develop branch:

epederson@dev-idb19 ~/work/conan (develop)
$ git branch
* develop

Thanks @sourcedelica for the report,

I guess the unzipping is failing because the proxy is not returning an error, but a 200 and a web page.

It might be that just conan config install is not loading the proxies configuration completely. Could you please check other things, like regular uploads/downloads of packages? Thanks!

But it shouldn't be going through the proxy at all because of no_proxy_match, right?

Yes, it shouldn't. I have to check, but it might be that the no-proxy configuration is not being injected into the conan config install command, which uses a separate flow from the rest of conan. I am checking it asap.

Looks like the new changes don't work for me either.

http = http://<proxy.domain>:3128
https = http://<proxy.domain>:3128
no_proxy_match = http://<server.domain>:9300*

I get the same error:

conan search -r <remotename> "OpenCV*"
ERROR: Permission denied for user: '<username>'. [Remote: <remotename>]

@leugenea you ran conan user <username> -p <password> -r <remotename> before the search, didn't you?

@memsharded , it returns the proxy's error page with ERR_ACCESS_DENIED for the address http://<server.domain>:9300/v1/users/authenticate.

I am running a couple of checks in the develop branch, and it seems the no_proxy_match is behaving as expected.

I am re-reading this thread, and I can't figure out what is happening. If you could compose a full curl command line that works for you (with the search URL for @leugenea and with the zip file in the local Bitbucket for @sourcedelica), it would help debug this issue. I read above that @leugenea's case worked in the browser but not in curl, but I didn't understand the reason. If it can't work with curl, then it is unlikely that conan will be able to communicate with it either, I guess.

I keep investigating this, thanks for all your feedback.

For reference, I have done the following setup:

    server {
        listen       80;
        server_name  localhost;

        #charset koi8-r;

        #access_log  logs/host.access.log  main;

        location / {
            proxy_pass http://localhost:8081;
        }
    }
  • A client with:
[proxies]
http = http://localhost:80
#https = http://<proxy.domain>:3128
no_proxy_match = http://localhost:9300*
  • My remotes are:
localhost: http://localhost:9300 [Verify SSL: True]
art: http://localhost:8081/artifactory/api/conan/conan-local [Verify SSL: True]

I have verified, with conan search * -r={localhost,art}, that:

  • Localhost doesn't pass through proxy, working in develop
  • Artifactory does pass through proxy, working in develop
  • Localhost fails in conan 1.1.1

So it seems that indeed the current develop is an improvement. I am not sure what could be happening in your cases.

Could it be related to auth in both cases? What authentication is required by the proxies? Thanks!

@lasote, I'll try to create curl command tomorrow.
My proxy doesn't have any authentication.

@lasote , I've tried several variants, and it looks like

curl --noproxy "<server.domain>,<domain>" http://<server.domain>:9300/v1/conans/AndroidNDK/r15c/lukyanets/testing/search
curl --noproxy "<server.domain>" http://<server.domain>:9300/v1/conans/AndroidNDK/r15c/lukyanets/testing/search
curl --noproxy ".<domain>" http://<server.domain>:9300/v1/conans/AndroidNDK/r15c/lukyanets/testing/search

work as expected, but anything with a * doesn't:

curl --noproxy "*.<domain>" http://<server.domain>:9300/v1/conans/AndroidNDK/r15c/lukyanets/testing/search

I got a hint of how it works here: https://stackoverflow.com/a/46848900.

We are not using proxy authentication.

It looks like you have configured Nginx as a reverse proxy. For this test you need to set up a forward proxy (aka HTTP proxy, aka proxy server). Here is an explainer on the difference between a forward and a reverse proxy.

One way to check would be to set your browser's proxy configuration to http://localhost:80 and then try to request http://localhost:9300 and http://localhost:8081 using your browser and see if it works.

I found a blog post on how to configure Nginx as a forward proxy. Another option is Tinyproxy. There are some other options listed in that Stack Overflow post.

Hi @sourcedelica

Not sure about forward vs. reverse. Following the SO post, a reverse proxy is hidden from the client, while in my case it is clear from the client that I am routing requests through my proxy, and specifically I have to use the no_proxy feature to reach the local server. Also, the configuration you linked uses the same directives I used in my Nginx.

I am not an expert at all, so I might be misunderstanding something, but I still think that I set up a forward proxy.

Hi @memsharded -

When you use your web browser to access Artifactory through the proxy, what URL are you using?

Also, when you use your web browser to access the local Conan server through the proxy, what URL are you using?

Did you make any changes to your browser configuration to use the proxy?

When using a forward proxy you would:

  1. Change your browser settings to use the proxy
  2. In the browser request the actual URLs (localhost:9300 or localhost:8081)
  3. The browser will intercept the request and send it to the proxy configured in step 1

Thanks,

With http://localhost:80 in my browser I see artifactory, without needing to configure anything in the browser itself.

I didn't need to make any changes in the browser configuration (it actually uses the system one on Windows) for the above setup.

If I configure the proxy in the "internet settings" (via the browser configuration), then all my navigation is directed to Artifactory, and I see Artifactory for every domain.com/ I type into the browser URL bar (as defined in the proxy).

Ok - that's what you would expect with a reverse proxy.

A reverse proxy routes all requests to the back-end server (in your case proxy_pass http://localhost:8081). It is typically used to present a single URL to browsers and then route the requests to the real web server(s) on the backend.

A forward proxy routes requests to the requested URL. Except that some requests might get blocked, like Facebook, or a request for malware.exe. A forward proxy can also do malware scanning on responses.

Could you try configuring Nginx as a forward proxy, or alternatively Tinyproxy?
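For reference, a minimal Tinyproxy setup for this kind of test might look like the following; the paths and the filter file are illustrative, and the directives are a sketch based on Tinyproxy's standard tinyproxy.conf:

```
# /etc/tinyproxy/tinyproxy.conf -- minimal forward proxy for local testing
Port 8888
Listen 127.0.0.1
Allow 127.0.0.1

# Deny every domain except those listed in the filter file
Filter "/etc/tinyproxy/filter"
FilterDefaultDeny Yes
```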

Thanks for the info, on my way to try it and debug it. Crossed fingers to be able to reproduce the issue!

I installed Tinyproxy, blocking all domains by default. I tried to use conan with this configuration, and every request (downloading a zip, conan search, etc.) is blocked and fails:

[proxies]
http = http://localhost:8888
https = https://localhost:8888

Then, using no_proxy_match to exclude github and bintray:

[proxies]
no_proxy_match = *bintray.com*, *github*
http = http://localhost:8888
https = https://localhost:8888

Searching in bintray or downloading a file from github works as expected.

By the way, of course, if I adjust the proxies at the operating system level (in my case a Mac), requests will go through the proxy unless I add an exception in the operating system control panel, even when using no_proxy_match, which only applies to the proxies specified in conan.conf.

Could you try conan config install with no_proxy_match matching the server that hosts the configuration zip file? My problem is that Conan appears to be using the proxy for conan config install even though I have no_proxy_match configured.

I wasn't sure what you meant by

By the way, of course, if I adjust the proxies at the operating system level (in my case a Mac), requests will go through the proxy unless I add an exception in the operating system control panel, even when using no_proxy_match, which only applies to the proxies specified in conan.conf.

Does Conan use the proxy settings from the OS control panel, ignoring conan.conf, or the opposite: does Conan always use conan.conf?

I tried with conan config install too. It works fine.
I meant that no_proxy_match only disables the proxies declared in conan.conf. If the operating system has a global proxy configuration, I don't know whether conan can do anything about it.

Rats! Thanks for all of your effort to try to replicate it. I will keep working on my end to see if I can figure out what's going on.

I'm moving this to the 1.3 release to see if we can finally figure out what is happening. Don't worry, we won't quit until it's solved!!

@lasote , I've returned to this issue again. It looks like if no environment variables are set, conan.conf works just fine, but if I have my usual http_proxy/https_proxy/no_proxy environment variables, they are used and the settings from conan.conf appear to be ignored.
I've googled again and found this issue: https://github.com/requests/requests/issues/879 and this comment. Maybe we could try "not trusting the env" when one of http_proxy/https_proxy/no_proxy_match is defined? Or, even better, not trust the env but still use the environment variables' values for anything not defined in conan.conf. Like this:

$ set http_proxy=foo.bar:3128
$ set https_proxy=foo.bar:3128
$ set no_proxy="*.bar"
$ cat ~/.conan/conan.conf
…
no_proxy_match = *.bar*

In this case we would do something like this:

s = requests.Session()
s.trust_env = False  # do not read http_proxy/https_proxy/no_proxy from the environment
proxies = {'http': os.environ['http_proxy'],
           'https': os.environ['https_proxy'],
           'no': <getter from conan.conf>}
s.proxies.update(proxies)

I know this looks more like a "hack", but I think it can solve most of the problems.

Thanks for the feedback. Yes, we could probably look into "cleaning" the environment when a proxy setting is defined at the conan level.

I will try to take a look at it for the 1.5 because it is an old issue, but we are very late so I cannot promise.

Great news! Thanks @lasote !
Is it merged into the 1.5 release? Any guesses about its release date?

During the week

Thanks!

Not very great news =( Configuration:

http = http://<proxy.domain>:3128
https = http://<proxy.domain>:3128
no_proxy_match = http://<server.domain>:9300*

doesn't work.
Using *<server.domain>*, *<.domain>* as no_proxy_match, or commenting out http, or even commenting out both http AND https, doesn't work either.

I have the same issue as @leugenea.

My config:
ubuntu 18.04
python 3.6
conan 1.5.2

@fnjeneza do you have also the environment variables set? Does it work removing the env vars? (http_proxy, https_proxy, no_proxy)

Was there ever a solution to this issue of no_proxy_match not being used? I'm running into the same issue.
