cc @isuruf @jakirkham
related to conda-forge
cc @CJ-wright
Why?
How does it break conda-forge?
You've stated it's a release blocker, so, I'm curious as to what makes it one.
Agreed. Someone's going to have to explain why build isolation breaks conda-forge (and why conda-forge cannot fix it at their end) before this can be considered a release blocker.
The link to https://github.com/conda-forge/staged-recipes/pull/5200 seems irrelevant - I can see no reference to this issue in there.
Unrelated to the issue of whether this is a release blocker, I'm not keen on users being able to disable build isolation. My recollection is that we added build isolation specifically because non-isolated builds caused issues (I can't find any docs on build isolation - we should probably get better at requiring doc updates that describe the feature and the reasons for it when landing significant changes like that...) so adding a flag to disable it seems likely to just open the door to those issues again. That just reinforces my interest in seeing a good justification for a flag like this.
The entire conda-forge architecture is based on internal build dependencies. In other words, packages need to be built in an environment of conda packages, because they may not be binary compatible with pypi packages. In addition, conda-forge has recommended the use of pip to build packages, rather than setup.py install.
I'm almost tempted to close this issue because no one in charge of conda-forge seems to care about it, and they probably won't realize the implications until builds start to break after pip 10 is released. Let me try one more time to garner some interest.
cc @isuruf @cj-wright @jakirkham
@xoviat, can you explain the issue to me? I'm not familiar with what build isolation is.
pip 10 will fetch build dependencies from pypi, even if dependencies are available locally. Do you understand?
Let me rephrase. pip 10 will create another "conda environment" composed exclusively of pypi packages, and then run the build inside of that.
Thanks. That's going to be an issue with conda-build in general and break building conda packages.
cc @msarahan
Yes, if pip doesn't care about the local environment, that will be hostile to the conda packaging ecosystem. It will hide errors in conda recipes and potentially use binary incompatible packages, depending on how wheels move forward with use of shared libraries.
Please add the option that @xoviat suggests here, or let us know and we'll do it.
Thanks @xoviat for raising the issue.
I'm not too familiar with the details so bear with me please. IIUC, you build a package's build dependencies first, and then build that package in an environment with exactly those dependencies installed, correct?
It is possible to tell pip not to hit PyPI and to just use wheels available in a local directory when building a package that uses the new "build isolation". That way you can control which build dependencies are installed in the temporary isolated build environment that pip creates for building the wheel from an sdist. Does that resolve your concern with build isolation, @msarahan?
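A minimal sketch of what that could look like, assuming the build dependencies have already been collected as wheels in a local ./wheelhouse directory (the directory name is illustrative) and that pip applies the same index options when populating the isolated build environment:

    # Collect the build dependencies as wheels in a local directory first
    pip wheel --wheel-dir ./wheelhouse setuptools wheel
    # Build and install the package without ever hitting PyPI; the isolated
    # build environment would then be populated only from ./wheelhouse
    pip install --no-index --find-links ./wheelhouse .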
@xoviat am I missing some detail here? -- you probably also know all this and have still filed this issue.
It is possible to tell pip not to hit PyPI and to just use wheels available in a local directory when building a package that uses the new "build isolation". That way you can control which build dependencies are installed in the temporary isolated build environment that pip creates for building the wheel from an sdist. Does that resolve your concern with build isolation, @msarahan?
No, because conda does not build wheels.
We don't use wheels at all. What we provide instead is a full environment - python, package dependencies, everything. We take a snapshot of the files in that "prefix," and then we run pip install --no-deps . using the python interpreter in that environment. We take another snapshot after that - new files are what get bundled up.
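In shell terms, that flow looks roughly like the sketch below (illustrative only, not conda-build's actual implementation; $PREFIX stands for the build environment's root):

    # Snapshot the files already present in the environment
    find "$PREFIX" -type f | sort > /tmp/before.txt
    # Install the package with the environment's own interpreter,
    # using only the dependencies conda has already provided
    "$PREFIX/bin/python" -m pip install --no-deps .
    # Snapshot again; the new files are what get bundled up
    find "$PREFIX" -type f | sort > /tmp/after.txt
    comm -13 /tmp/before.txt /tmp/after.txt > /tmp/new-files.txt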
Yeah, IIUC build isolation is going to be a big problem not just for conda and conda-forge, but other packagers as well. Copying some other folks that might be interested in this issue. Sorry in advance if I'm being overly concerned about this and misestimating its area of effect.
cc @adamjstewart (Spack) @warsaw (Debian) @junghans (Gentoo) @FRidh (Nix) @jimfulton (Buildout)
Why not use setup.py bdist_wheel followed by pip install --no-deps the_wheel?
@benoit-pierre As I'm sure you're aware, setup.py bdist_wheel is not a reliable abstraction given that PEP 517 will be coming out "soon."
Yeah, the wonder of multiple build systems....
@jakirkham Thanks for pinging me. Spack builds all Python packages using python setup.py build and python setup.py install, not using pip, so I don't _think_ this will be an issue for us?
Let me bring more Spack people into the mix just to confirm: @tgamblin @scheibelp @lee218llnl
@adamjstewart Since you're here, are you aware of the coming PEP 517 modifications to packaging infrastructure?
The one-sentence summary is that we're going to make setup.py install implementation-defined and require pip to install packages (or more specifically, a compatible frontend).
Looking at https://github.com/pypa/pip/issues/3691#issuecomment-263052471, @njsmith mentions that there should be a flag to disable build isolation.
Seems like this is an item that was missed in the list of things to be implemented for PEP 518.
@pradyunsg This shouldn't really be a complicated fix; I think it's just checking flags in operations/prepare.py.
@xoviat I wasn't aware of PEP 517. I won't claim that I understand most of it, but we may have to reconsider how we install packages. It sounds like we'll either need to wrap around pip or become a compatible frontend for the small subset of packages that decide to take advantage of PEP 517? We weren't wrapping around pip because Spack needs to manually manage dependencies, downloads, and installation directories, but if this flag is added perhaps we could make use of pip?
We are running into several problems with our manual Python installations involving RPATHs and backports (https://github.com/spack/spack/pull/4545, https://github.com/spack/spack/issues/7370). I imagine that pip has solved these issues long ago, so that may simplify our implementation greatly.
Anyway, I don't want to derail this conversation any further. Thanks for letting me know about these changes. Some of the other Spack developers may have more to say about this.
@pradyunsg - In @njsmith's comment, he was talking about broken build requirements, which is a relatively niche issue. Conda-forge and Spack sound like they might be heavy users of such an option, to the point where it's going to be required for them. But if it's a simple enough fix that's not a major point.
I'd also encourage the conda-forge and Spack people to review PEP 517 (and 518) and provide feedback on distutils-sig so we can ensure pip 11 meets everyone's requirements. Also, please can they test current pip master (and the pip 10 beta when it comes out, but there won't be a long beta phase, so please don't wait for that!) to ensure there's no other nasty surprises awaiting us?
@adamjstewart The command that you're looking for is (though you'll need the flag that we're going to add when pip 10 comes out):
python -m pip install --no-deps --ignore-installed .
@pfmoore Yeah, it's a simple fix.
PS: It's 2am for me here and I have an examination on IoT, in about 7 hours or so. I'll shoot up a PR today (in my timezone). =)
I'd also encourage the conda-forge and Spack people to review PEP 517 (and 518) and provide feedback on distutils-sig so we can ensure pip 11 meets everyone's requirements. Also, please can they test current pip master (and the pip 10 beta when it comes out, but there won't be a long beta phase, so please don't wait for that!) to ensure there's no other nasty surprises awaiting us?
+1
Please do.
I'm confused with this issue, especially due to the lack of a description.
With Nix we first use a command to build a wheel, which in the case of the setuptools builder is python setup.py bdist_wheel, and then in the generic install phase we install it with pip install *.whl. In both the building and installing phases the dependencies are already "installed" and on PYTHONPATH.
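The shape of that two-phase flow is roughly the following (illustrative only, not the actual Nixpkgs builder; the dependency path is hypothetical):

    # Dependencies are already built and exposed on PYTHONPATH (hypothetical path)
    export PYTHONPATH=/nix/store/abc123-deps/lib/python3.6/site-packages
    # Build phase: produce a wheel with the setuptools builder
    python setup.py bdist_wheel
    # Install phase: install the wheel without resolving or fetching anything
    pip install --no-deps --no-index dist/*.whl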
pip 10 will fetch build dependencies from pypi, even if dependencies are available locally. Do you understand?
Why?
Let me rephrase. pip 10 will create another "conda environment" composed exclusively of pypi packages, and then run the build inside of that.
I see. It's understandable trying to build such a sandbox, but for Nix we're not interested in it.
I'll need to read 517/518 again
@FRidh What is happening is that this is in preparation for PEP 517, which says
A build frontend SHOULD, by default, create an isolated environment for each build, containing only the standard library and any explicitly requested build-dependencies.
Package build dependencies are setuptools and wheel (currently - under PEP 517, projects will be able to specify that, for example, they use flit rather than setuptools), plus any packages that need to be available for the project to build from source. Currently, support for such extra packages is provided by setuptools (setup_requires), but that is a problem, because it doesn't use pip to do the installs (the details are messy and rooted in history). In future, packages will be able to specify build dependencies in pyproject.toml - that's basically what PEP 518 is about.
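For reference, a minimal example of what such a declaration looks like under PEP 518 (written via a heredoc to keep the snippet in shell form; the listed requirements are just the defaults a setuptools project would use):

    cat > pyproject.toml <<'EOF'
    [build-system]
    # PEP 518: packages pip must install into the isolated build environment
    requires = ["setuptools", "wheel"]
    EOF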
When installing dependencies for a build, pip doesn't want those dependencies to end up in the user's installation, because they haven't been requested by the user, and are only needed for pip's internal build step. So we install the dependencies in a separate area, and set up the build to run, as noted in that quote from PEP 517 above, in an environment that contains only the stdlib and those dependencies.
The problem is that this is a change for people who currently build projects from source that don't correctly specify their build dependencies in setuptools or pyproject.toml, but rather simply document "you need XXX installed for the build to work". At present, they just have to do as the project documentation asks, but in future they won't be able to - because they won't be able to get at that isolated environment.
What we're proposing here is to add a flag to pip that says we should let the build also see the user's installed packages. That would be a quality of implementation change to how we implement PEP 517, allowing users (or in this case, repackagers) to opt out of the isolation that PEP 517 suggests is good practice.
Note that we understood there were not a lot of situations where this would be an issue (projects with build dependencies that don't specify them in the build code, but simply document what users need to have present). I gather that repackagers like conda, Spack and Nix may have different reasons for wanting to avoid build isolation, which makes such a flag more important.
Hope that clarifies.
PS I should also point out that one of the reasons for writing PEP 517 and 518 is so that other build frontends, focused on alternative use cases, could be written without having to reverse engineer what pip does. Longer-term it may be that a build system more suited for repackaging projects would be worth having, and the PEPs enable that, but obviously that's a much bigger task, and not something I think is likely to happen in the immediate future - so pip needs to make sure you guys are involved in discussions around features like this (hence the need to make sure you're aware of what gets discussed on distutils-sig).
Thanks for the clarification.
The way I see it Nix is considered a build frontend. We will run the build backends ourselves (python setup.py bdist_wheel or flit wheel) and use pip only to install the eventual wheel, as we do already.
Eventually we're going to need a script that takes the relevant info from pyproject.toml and converts it to fields in a Nix expression.
@FRidh Excellent - yes, that sounds like a great approach. Hopefully PEP 517/518 give you the infrastructure you need to make that job easier.
It sounds like python setup.py bdist_wheel will not do what you want @FRidh. Please see this comment.
@jakirkham It's fine for setuptools-based projects. For flit-based projects you need flit build.
If you want a general command that builds a wheel from any package, you need to follow the protocol in PEP 517. Pip is one implementation of that (which you're welcome to use) but you can do it yourself or use another tool (which sounds like what @FRidh is planning).
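For instance, using pip itself as that frontend, building a wheel from an arbitrary source tree looks roughly like this (the output directory name is arbitrary):

    # Ask pip to build (but not install) a wheel for the project in the
    # current directory, placing the result in ./dist
    pip wheel --no-deps --wheel-dir ./dist .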
Woah. xoviat has... deleted his/her account. Huh.
Wow, never seen this happen before...
What? Wow, what on earth's going on?
What? Wow, what on earth's going on?
see https://github.com/scikit-build/scikit-build/issues/309#issuecomment-371648631
@jcfr thanks for providing an update - glad to hear it's nothing worse than that 😄
Thanks @jcfr for the pointer. =)
glad to hear it's nothing worse than that
+1
FYI (for those that don't already know) PR ( https://github.com/pypa/pip/pull/5053 ) adds an option to disable build isolation. Would be good to get some help testing it for those with time/interest.
Fixed via #5053
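For anyone who wants to combine the new option with the conda/Spack style install discussed above, the invocation would look roughly like this (flag spelling as it landed in that PR; double-check the pip 10 release notes in case it changes before release):

    # Install the current directory without resolving dependencies and
    # without creating an isolated build environment
    python -m pip install --no-deps --no-build-isolation --ignore-installed .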
Thanks PyPA team for your consideration on this issue. We at Anaconda will be testing master with this and will let you know if we encounter issues.
cc @jjhelmus
@msarahan Brilliant - we're hoping to start moving forward on pip 10 pretty quickly now (this was one of the last blockers) so please let us know if you have any issues, so we can get them sorted before we release.