Is it possible to specify dependencies inside the Pipfile that are installed from conda instead of PyPI?
Something like this:
[[source]]
url = 'https://pypi.python.org/simple'
verify_ssl = true
name = 'pypi'

[[source]]
url = 'https://repo.continuum.io/...'
verify_ssl = true
name = 'conda'
When I set up a new computer I install anaconda, then install some packages with conda install (e.g. numpy, scipy, cython, ...) and the remaining ones I install with pip install.
I was able to get pipenv working with conda, but I do not know how I can define the dependencies from conda (or whether it is possible at all). Searching for an answer to my question, I can only find people who mention that conda does not work with pipenv.
$ python -V
Python 3.6.3 :: Anaconda custom (64-bit)
$ pipenv --version
pipenv, version 9.0.3

Unfortunately, beyond the basic command line switch to make "system" packages visible from within the pipenv-managed environment, this isn't currently possible, as there's no Python level specification for explicitly declaring system level dependencies (whether those system level dependencies are being provided by conda, a Linux distro, etc.).
An initial draft of such a specification was developed a few years ago (see https://github.com/pypa/interoperability-peps/pull/30/files), so the best path for anyone wanting to move this forward would be to get involved with pip, setuptools, and the auditwheel project to get a sense of what might be involved in actually implementing the proposal (auditwheel is relevant as it already includes code for checking which external libraries an extension module relies on, pip is relevant as it would need to actually implement checking those external dependencies, and setuptools may need changes in order to correctly publish the revised metadata).

Alternatively, the conda folks may be amenable to a feature request that teaches conda to read the Pipfile format, using conda packages where possible (overriding any source declarations in the Pipfile), and falling back to PyPI for everything else.
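To make that proposed fallback concrete, here is a minimal sketch of the "use conda where possible, fall back to PyPI" resolution step. This is not real conda code: the availability set, package names, and function name are all hypothetical stand-ins for a real query against a conda channel index.

```python
# Hypothetical sketch of the "conda where possible, PyPI otherwise"
# fallback described above. conda_available is a stand-in for a real
# lookup against a conda channel's package index.

def partition_packages(requested, conda_available):
    """Split requested package names into (conda, pypi) groups."""
    conda_pkgs = [p for p in requested if p in conda_available]
    pypi_pkgs = [p for p in requested if p not in conda_available]
    return conda_pkgs, pypi_pkgs

conda_index = {"numpy", "scipy", "cython"}
wanted = ["numpy", "scipy", "requests", "flask"]
from_conda, from_pypi = partition_packages(wanted, conda_index)
print(from_conda)  # ['numpy', 'scipy']
print(from_pypi)   # ['requests', 'flask']
```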
Thanks for the answer; I thought pipenv was a wrapper around other tools to simplify things. So it would be possible to define a backend for a source (default pip).
Your second suggestion is not what I want: I want to define the source for each package, because for some packages with C/C++ code I prefer conda, but for pure Python I prefer PyPI (newer versions).
Then I will stay with my current workaround.
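For what it's worth, per-package source selection does exist in the Pipfile format for pip-compatible indexes, via the per-package index key; it just cannot point at a conda channel, since every [[source]] must speak the PyPI simple-index protocol. A sketch (the second index URL is a placeholder, not a real server):

```toml
[[source]]
url = "https://pypi.python.org/simple"
verify_ssl = true
name = "pypi"

[[source]]
# Placeholder URL; must be a pip-compatible index, not a conda channel.
url = "https://my.private.index/simple"
verify_ssl = true
name = "private"

[packages]
requests = "*"
numpy = {version = "*", index = "private"}
```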
@boeddeker
I was able to get pipenv working with conda
Can I ask how exactly you did this and what exactly you mean by "working with"? The most that I have been able to do is tell pipenv to use the python in $ANACONDA_DIR/envs/ENV_NAME to copy into a new pipenv-created directory, and then start completely over installing all packages in my reqs into the new env.
I would LOVE to be able to tell pipenv a specific directory, but I have not been able to do this yet. Is this what you have done?
After that we could just use a Makefile to automate the installation logic.
@xguse I don't think I found the solution that you want. By "working" I meant that the new env was using the python interpreter from anaconda (I didn't explore it much, because I stopped thinking about pipenv when I heard that they don't support conda as a second package manager).
I do not care about envs. I wanted to automate my installation process on a new machine (i.e. download anaconda and install all dependencies: some from conda, some from PyPI, some from git, and some from a path as editable installs). So I didn't care about copying to a new environment, but it is important to me that I can define the source of a package like numpy.
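That "automate the installation on a new machine" idea can be scripted without pipenv at all. A minimal sketch, assuming a hypothetical declarative mapping of package name to source, that produces the conda/pip command lines a Makefile target would run:

```python
# Minimal sketch: turn a declarative {package: source} spec into the
# conda/pip command lines a Makefile target could execute. The spec
# contents below are illustrative, not a real project's dependencies.

def build_install_commands(spec):
    """Group packages by source and emit one command string per installer."""
    conda_pkgs = sorted(p for p, src in spec.items() if src == "conda")
    pip_pkgs = sorted(p for p, src in spec.items() if src == "pypi")
    commands = []
    if conda_pkgs:
        commands.append("conda install -y " + " ".join(conda_pkgs))
    if pip_pkgs:
        commands.append("pip install " + " ".join(pip_pkgs))
    return commands

spec = {"numpy": "conda", "scipy": "conda", "requests": "pypi"}
for cmd in build_install_commands(spec):
    print(cmd)
# conda install -y numpy scipy
# pip install requests
```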
I believe conda packages for numpy, scipy, cython, etc. nowadays do include the .dist-info directories so that pip can see that they're installed, and use them for fulfilling wheel dependencies. (You still get into a mess if you try to upgrade those packages using pip, because pip and conda's idea about what packages are installed then gets out of sync. But as long as you never update those packages with pip, it should be OK.)
The more complicated case would be if you wanted to use conda to install, say, MKL or openssl, and have pip-installed packages use those. That's what would need the infrastructure and PEP that @ncoghlan is referring to, I think. But I'm not sure this is relevant to the OP's problem.
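The .dist-info visibility described above can be checked from Python itself. This sketch uses the standard library's importlib.metadata (Python 3.8+) to ask whether a given distribution has discoverable metadata, which is what pip consults when deciding a dependency is already satisfied, regardless of which installer put it there; the function name is my own, not a pip or conda API.

```python
# Check whether a distribution is visible via its .dist-info metadata,
# i.e. whether pip's resolver would consider it installed (whether pip
# or conda wrote the metadata). Helper name is illustrative.
from importlib import metadata

def is_visible_to_pip(dist_name):
    """Return True if dist_name has discoverable .dist-info metadata."""
    try:
        metadata.distribution(dist_name)
        return True
    except metadata.PackageNotFoundError:
        return False

print(is_visible_to_pip("surely-not-installed-xyz"))  # False
```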
Thanks for the explanation.
I currently use a mix of conda install and pip install, but I miss a way to describe it deterministically and maintainably. Currently I have a file where I write many conda install ... commands, and the pip dependencies are inside a setup.py.
I was hoping that pipenv is a high-level tool that can select the backend, e.g. pipenv install --backend=conda ....
Further, the lock file does not work with conda dependencies, so one of my main reasons to use pipenv, reproducibility, does not hold.
@boeddeker The current expectation is that combinations of pipenv and conda will be handled the same way as combinations of pipenv and any other platform level package manager (such as dnf, yum, or apt-get): by using a layered approach to environment specification, where --site-packages gets passed to pipenv init to allow access to system provided packages, without pipenv actually managing those packages.
The fact that conda is a cross-platform package manager rather than a Linux-specific one doesn't change that basic layered architecture.
conda has one further important difference from the others: you don't need root rights.
I understand your argument, and I also understand that the way I use conda is a workaround; an ideal solution would be for the numpy from PyPI to be equal to the one from conda.
Your suggestions are good for a long-term solution, but require much more effort.
Since pipenv sits on top of pip, I was hoping that the backend could be exchanged. But I know that allowing conda as a backend could introduce many incompatibilities.
The proliferation of manylinux wheels is making Conda less of a priority for us.
E.g. $ pipenv install scikit-learn just works now, on almost any machine.
e.g. $ pipenv install scikit-learn just works now, on almost any machine.
Sorta - unless you need to install numpy against a different version of openblas or with mkl extensions, but wheels have definitely made things way better.
Also, for me in bioinformatics it is almost always the case that my project needs not only Python libraries but also R libs and straight compiled software. Not to mention that the Anaconda libs are usually faster. Conda is often the easiest way to get such projects provisioned in HPC environments.
Yeah, I don't think pipenv is the right tool for you if you're working on those projects, basically.
Unless toggling site-packages works well enough for you.
Note that "pipenv with a conda backend" would be comparable to an arrangement along the lines of "pipenv with an apt-get backend": it's confusing different layers of the packaging stack in a way that doesn't make logistical sense.
However, the conda folks are open to the idea of transparently managing wheel archives from PyPI as if they were conda packages: https://github.com/conda/conda/issues/5202
That approach would better respect the layering described in http://www.curiousefficiency.org/posts/2016/09/python-packaging-ecosystem.html#platform-management-or-plugin-management, which points out that pip (and hence tools based on it, like pipenv) is a language specific plugin manager for Python runtimes, whereas tools like conda are language independent component managers for the construction of larger software systems.
Agreed, Conda is a system package manager that happens to specialize mostly in Python packages.
@kennethreitz: Yes I think that I am coming to the same conclusion. I just really like y'all's api and wish I could use it. I may just need to adjust my project automation logic to use conda for non-python and numpy, MKL stuff and pipenv for pure python and my pypi served projects.
Do you see any glaring problems with invoking a conda env and then dropping into a pipenv subshell from inside the conda env?
edit: and building the pipenv env with the conda site-packages exposed to it.
I mean, there's nothing stopping you from:
$ pipenv --python /path/to/condaenv/python --site-packages
$ pipenv install
great.
"pipenv with a conda backend" makes perfect sense in principle. Conda (unlike apt-get) provides all the basic concepts that pipenv is built on: the ability to create isolated environments, resolve package requirements into specific pinned versions and install them in those environments, etc. Of course every project has to make hard decisions about how to spend limited resources, and my guess is that this is outside the scope of what the pipenv devs want or are able to prioritize right now, but the idea is coherent :-).
$ pipenv --python /path/to/condaenv/python --site-packages
Oh perfect! I tried to do --system-site-packages and didn't bother to check -h. Whoops! :flushed:
The only issue with the --site-packages approach, of course, is that you end up with a numpy that isn't backed by MKL.
I guess someone just needs to build a condaenv (or whatever it could be called). It is obviously out of scope of Pipenv though.
Here is my first answer on Stack Overflow:
https://stackoverflow.com/a/62256587/12686551
Thanks so much for the guidance! Any upvote is much appreciated for rep 👍