I am working on a Python package with pipenv and am faced with the challenge of keeping setup(install_requires=...)
in sync with my Pipfile's runtime dependencies. Is there a recommended approach?
[Answer 2019-08-23] Best practice, as also discussed below:
For applications that are deployed or distributed in installers, I just use Pipfile.
For applications that are distributed as packages with setup.py, I put all my dependencies in install_requires.
Then I make my Pipfile depend on setup.py by running pipenv install '-e .'.
Does pipenv have a Python API that could be used? I manually update the list as I work on a project, but the following could be nice:
from setuptools import setup
from pipenv import find_install_requires  # proposed API; does not exist yet

setup(
    # ...
    install_requires=find_install_requires(),
    # ...
)
The function just needs to return a list of the keys in the Pipfile's [packages] section. I imagine you could achieve this functionality already using a helper function, but it'd be nice if it was part of pipenv so we don't all have to implement it.
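In the meantime, a helper along those lines is easy to sketch yourself (a minimal sketch, assuming the Pipfile sits next to setup.py and that the third-party toml package is importable at build time; that import is exactly the build-time dependency problem raised below):

import os
import toml  # third-party; note the build-time dependency caveat discussed below

def find_install_requires():
    """Return the package names listed in the Pipfile's [packages] section."""
    pipfile_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'Pipfile')
    with open(pipfile_path) as f:
        data = toml.load(f)
    return list(data.get('packages', {}))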
Pipfile, the implementation backing Pipenv's Pipfile parsing, can help with this:
import pipfile
pf = pipfile.load('LOCATION_OF_PIPFILE')
print(pf.data['default'])
But I wouldn't recommend this, or depending on Pipenv in setup.py. Importing pipenv (or pipfile) means the user needs to actually install that before trying to install your package, and tools like Pipenv trying to peek into it without installing (setup.py egg_info) won't work. The setup.py should only depend on Setuptools.
A middle ground solution would be to write a tool similar to bumpversion that automatically syncs a text file based on Pipfile. Distribute this file with your package, and read it in setup.py. Then use CI or a commit hook to make sure the files are always in sync.
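As a rough sketch of that middle ground (script and file names are made up for illustration): a small script regenerates a plain text file from the Pipfile, CI or a commit hook keeps it fresh, and setup.py only ever reads the text file:

# sync_requires.py -- regenerate install_requires.txt from the Pipfile,
# so setup.py never needs to import pipenv or pipfile itself.
# Assumes simple `name = "specifier"` string entries under [packages].
import toml

with open('Pipfile') as f:
    packages = toml.load(f).get('packages', {})

with open('install_requires.txt', 'w') as out:
    for name, spec in packages.items():
        out.write(name + ('' if spec == '*' else spec) + '\n')

# setup.py then reads it back with no extra dependencies:
#     install_requires=open('install_requires.txt').read().splitlines()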
Yeah, good point, ignore me.
Perhaps "pipenv install" could do the sync?
@uranusjr Just testing my assumptions here, but wouldn't it be possible to add pipenv to setup.py's setup_requires, and delay the pipenv import to a setuptools Command? Or is that considered bad practice?
@Korijn It might not be per se, but given the current best practice is to use separate virtualenvs for each Python project, this would require the user to install a copy of Pipenv for each project, which is not very intuitive. Pipenv should only be installed once (usually globally), and is used outside the project's virtualenv to manage it, not inside the project's virtualenv.
So what's the resolution to this that led to the issue's closure? Is there no means of keeping track of both the dependencies in the Pipfile and in setup.py? Is there a best practice that circumvents the issue?
For applications that are deployed or distributed in installers, I just use Pipfile.
For applications that are distributed as packages with setup.py, I put all my dependencies in install_requires.
Then I make my Pipfile depend on setup.py by running pipenv install '-e .'.
[Update 2019-08-23] I keep the dev packages in Pipfile nowadays, only the runtime dependencies get to live in setup.py.
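For illustration, a minimal pairing under this workflow might look like the following (package name and dependencies are hypothetical):

# setup.py -- runtime dependencies live here
from setuptools import setup, find_packages

setup(
    name='my-package',
    version='0.1.0',
    packages=find_packages(),
    install_requires=['requests'],  # runtime deps only
)

After pipenv install '-e .' and pipenv install --dev pytest, the Pipfile then carries only the editable root plus the dev tools:

[packages]
"e1839a8" = {path = ".", editable = true}

[dev-packages]
pytest = "*"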
I think @Korijn's approach is best practice here. Pipfile (and requirements.txt) is for applications; setup.py is for packages. They serve different purposes. If you need to sync them, you're doing it wrong (IMO).
@uranusjr Not according to the documentation.
Pipenv is a tool that aims to bring the best of all packaging worlds (bundler, composer, npm, cargo, yarn, etc.) to the Python world. Windows is a first-class citizen, in our world.
It automatically creates and manages a virtualenv for your projects, as well as adds/removes packages from your Pipfile as you install/uninstall packages. It also generates the ever-important Pipfile.lock, which is used to produce deterministic builds.
Maybe I'm just not getting it. Could you elaborate on your statement please?
The way I understood it is that pipenv is a complete dependency management system similar to Composer for PHP, but I'm beginning to realise that it isn't. Especially as pipenv won't install the dependencies of a dependency that has a Pipfile and Pipfile.lock, but no install_requires in setup.py.
@vascowhite the question you're asking isn't about pipenv but rather about a fundamental separation between Python packaging tools. In the Python workflow, setup.py files are used to declare installation dependencies of a distributable package. So, if I have a package like requests, and it depends on people having cffi installed, I would declare that in my setup.py so that when people run pip install requests it will perform that install if necessary as well.
Pipfiles, like requirements files, are not meant to be traversed recursively. Instead there is a single Pipfile which rules over all of the dependencies for a project you might be developing. The point of this is that the old workflow generated a flattened list of pinned requirements, while Pipfiles contain top-level requirements and prefer unpinned where possible. When you install a package, the requirements from its setup.py are recursively resolved to the best match that also fits your other requirements.
So if you want to know why Pipfiles aren't recursively resolved, it's because that's just not how they are used in Python. Running pipenv install ultimately requires a target that is installable by setuptools, which means it will have its install requirements defined in its setup file.
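To make that concrete, the declaration described above would look something like this (a sketch using the example names from the comment):

from setuptools import setup, find_packages

setup(
    name='requests',   # the example package from the comment
    version='0.0.0',
    packages=find_packages(),
    # pip resolves install_requires recursively, so `pip install requests`
    # pulls in cffi (and cffi's own requirements) as needed.
    install_requires=['cffi'],
)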
@techalchemy I was halfway through a similar response before yours popped up (delete everything)
I would also like to note, @vascowhite, that what you're asking is not in fact outlandish. With Pipfile and the lock file both being available, it is possible to reconcile the two distinct workflows. In an ideal world, Pipfile replaces setup.py's install_requires and is used to specify virtual dependencies, while the lock file is used to produce a concrete dependency set based on it, replacing requirements.txt.
Python's packaging system, however, is far from ideal at the present time, and it would require a lot of cleanup before this can ever happen. Heck, Pipenv is already having difficulties handling dependencies right now (p.s. not anyone's fault); it would probably barely work except for the simplest of projects if used like that.
The hope is not lost though (at least not mine). There have been a lot of PEPs proposed and implemented around this issue, and I feel things are on the right track, with setup.py and requirements.txt both moving toward a rigid, declarative format. With an ecosystem so large, things need to move slowly (or see Python 3.0), but they are indeed moving.
@techalchemy @uranusjr
Thank you both for your clear answers, they do straighten a few things out in my mind. It does seem to me that the documentation is overstating what Pipenv is able to do, and that is partly the cause of my confusion. The majority of my confusion is down to me though :)
Having come from PHP I have been confused by packaging in Python; Composer is a breeze in comparison. I do find Python much easier to develop in and love using it. Let's hope things improve, I'm sure they will given the efforts of people like yourselves and Kenneth Reitz.
If you stick to my advice mentioned above, you can perfectly harmonize both setup.py and pipenv. No need to get all fussy. :)
looks like I'm not the only one that's confused #1398
Put much better than I could though :)
Came here for info on using pipenv with a setup.py; adding my 2 cents to the discussion.
I have a Python package whose setup.py looks like:
setup(
    name='my-pkg-name',
    packages=find_packages(),
    install_requires=[...],
    extras_require={
        'develop': ['click']
    },
    entry_points={
        'console_scripts': [
            'my-pkg-name-cmdline = my-pkg-name.cli:tool'
        ]
    }
)
As you can see I use click in the script entrypoint for tasks such as building and deployment.
When I run $ my-pkg-name-cmdline build I don't find click installed, because pipenv install --dev installs packages in the pipenv virtualenv. I need to fiddle with pipenv shell/exit in order to make it work. Looks like there are still some rough edges on this.
Therefore +1 for not using pipenv for packages.
I think you are expected to call pipenv run my-pkg-name-cmdline build in that scenario.
@Korijn I'm still not sure about the correct workflow (still experimenting a bit with pipenv).
As of yet, the workflow that seems to be working for me is:
(starting from scratch)
1- pipenv --three
2- pipenv install [--dev]
3- pipenv install -e . (install application locally)
4- pipenv shell (to enter the virtualenv)
Now I can run my package's build click script from the command line.
If I enter the virtualenv (step 4) before installing the application locally (step 3), it does not work.
Perhaps I just have to rewire my brain into remembering that packages should be installed before pipenv shell (while using virtualenv requires you to install packages with the virtualenv activated).
@apiraino I think you're not getting things right here. If you want to use (import) click in your package, you should put it in install_requires instead, so people (including yourself) installing your package can have click installed as well. Putting it in extras_require['dev'] means it's an optional dependency, i.e. your package can work without it, but installing those extras can provide certain extra features.
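Applied to the setup.py posted above, that advice would look roughly like this (only the relevant arguments shown):

setup(
    # ...
    install_requires=['click'],  # imported by the console script, so a runtime dep
    extras_require={
        'develop': []  # keep only truly optional, dev-only extras here
    },
    entry_points={
        'console_scripts': [
            'my-pkg-name-cmdline = my-pkg-name.cli:tool'
        ]
    }
)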
This discussion really does not have anything to do with Pipenv anymore. I suggest you bring this problem to a more suitable forum, such as StackOverflow or Python-related mailing lists.
@Korijn pipenv install '-e .' yields a Pipfile not reflecting the modules listed under install_requires in setup.py.
This is still the case for pipenv 9.0.3.
How can I generate Pipfile from my setup.py's install_requires?
Don't use quotation marks.
I stopped using quotation marks. However, I don't get a Pipfile created that includes the deps from the install_requires section of setup.py.
@benjaminweb I was confused by the same thing today. However, I'm starting to think that the current behavior may be correct.
@techalchemy mentioned above that
Pipfiles contain top-level requirements and prefer unpinned where possible. When you install a package, the requirements from its setup.py are recursively resolved to the best match that also fits your other requirements.
If you use the workflow mentioned in https://github.com/pypa/pipenv/issues/1263#issuecomment-362600555, when you run pipenv install '-e .' on a project without an existing Pipfile, pipenv generates a new Pipfile with the following:
[packages]
"e1839a8" = {path = ".", editable = true}
In this case, the only package you explicitly requested to be installed into the virtualenv is the package itself (i.e. "."), so it makes sense that only "." is added to [packages] in the Pipfile. Similarly, if you pipenv install requests, none of the install_requires dependencies from requests's setup.py are added to your project's Pipfile either.
However, when the package installation step happens next, the install_requires dependencies will be installed as part of the dependency resolution for the package.
Note that unlike the Pipfile, the Pipfile.lock records all the exact dependencies for the entire virtualenv, which has to include the install_requires dependencies locked to specific versions. If you look at the Pipfile.lock that's generated, you'll see the install_requires dependencies listed.
It's possible I'm totally misunderstanding how this is expected to work. Maybe @techalchemy or @uranusjr can confirm if this is the correct way of thinking about this?
Your line of thinking matches mine. I'll also mention that with recent Setuptools advancements and tools such as Flit you can still specify your package's dependencies in nice TOML form (instead of requirement strings in setup.py, which is admittedly not very pretty). You just specify them in pyproject.toml instead of Pipfile.
@uranusjr it sounds like what you're saying is that Pipfile only needs to explicitly list project dependencies if they are not already being captured by a packaging tool like Setuptools or Flit (via setup.py or pyproject.toml).
For example, if setup.py looks like:
install_requires=['A'],
extras_require={
    'dev': ['B'],
},
Then the Pipfile only needs the following:
[[source]]
url = "https://pypi.python.org/simple"
verify_ssl = true
name = "pypi"
[packages]
"e1839a8" = {path = ".", editable = true}
[dev-packages]
"e1839a8" = {path = ".", extras = ["dev"], editable = true}
Running pipenv install would install dependency A for production, and pipenv install --dev would install dependencies A & B for development.
If someone is already using Setuptools or Flit, is there ever any reason why dependencies should be added into the Pipfile under [packages] or [dev-packages]? Looking at Requests as an example, it's not obvious to me why the development dependencies are listed explicitly in Pipfile under [dev-packages], but the install_requires and test_requirements dependencies are all captured in setup.py.
It seems like the only reason why you would need to add dependencies explicitly to Pipfile is if you're not using Setuptools or Flit. Is this correct? Are there reasons why this is not true?
I think it's just personal preference. Listing dev dependencies in extras_require['dev'] is merely a convention; dev-packages OTOH is a well-defined key. extras_require['dev'] would also enable any user to pip install package[dev], which maintainers may not like. I can understand people preferring one or the other.
As for [packages], no, there really isn't a scenario where it makes more sense than install_requires IMO. I'm sure people will come up with creative usages though.
Why is this issue closed?
@JulienPalard it's closed because of a few reasons:
1. Pipfile and setup.py files are not really meant to be kept in sync, per se. I think there have been a bunch of articles linked discussing the details, but tl;dr: Pipfile is roughly equivalent to a top-level requirements.txt.
2. To get the setup.py resolved into the managed virtualenv, the workflow would just be pipenv install -e . -- this puts a single entry into your Pipfile (the top-level root) and the resolved dependency graph into your Pipfile.lock, including the resolved install_requires. If you want to update the virtualenv with the latest contents due to a changed install_requires, you'd have to run pipenv update, which is the same as pipenv lock && pipenv sync in the latest version.
Hope this is helpful!
Actually they are more similar to each other than Pipfile is to requirements.txt: requirements.txt specifies all the packages in a flat manner, while Pipfile & setup.py require only the entry-level dependencies. Pipfile.lock & requirements.txt contain similar info.
I created a POC sync script that can be further implemented but currently covers our use case:
https://gist.github.com/iddan/f190c3c7d54f4fc4655da95fb185e641
@iddan that's essentially what I've been saying, each of these things represents a top-level listing of your dependencies, but one is meant for _installing an application_ (setup.py) and the other is meant for _developing it_ (Pipfile).
In the former case of using setup.py, you have the same options to declare open-ended or fully-pinned dependencies as you would with requirements.txt, but most people use open-ended dependency pinning here. In a Pipfile you can specify strict or loose pins as well, but it is similarly encouraged to keep this as an _unflattened_ listing of your dependency graph. Again, I want to stress that both of these things are also completely valid in requirements.txt files, which is why I am continuously emphasizing the separation of responsibilities between applications and libraries. You will hear this emphasized in every talk and every tutorial and in all messaging that is put out by the PyPA.
Pipfile.lock and requirements.txt are not the same thing. You can generate a valid requirements.txt from a Pipfile.lock, but you cannot directly generate a lockfile from a requirements file without using an intermediary Pipfile. That is because a Pipfile.lock represents a transitive closure and will always require an intermediary Pipfile in order to perform dependency resolution.
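For what it's worth, pipenv could emit that one-way conversion directly at the time of this thread (the -r flag was removed in later releases):

$ pipenv lock -r > requirements.txt            # runtime deps from Pipfile.lock
$ pipenv lock -r --dev > dev-requirements.txt  # dev deps only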
As to the original question, there is no reason to keep a setup.py file in sync with a Pipfile other than by simply including the directory in question as an editable path. When you run pipenv lock, the dependencies in the setup.py file will be automatically resolved. I'm not sure about your specific use case, @iddan, or why you need to have a special parser to write things back to the setup file, but I suspect you may want to read Donald Stufft's article on setup.py vs requirements.txt or to reference this comment by one of our maintainers about the distinction and how it applies specifically to Pipenv.
My use case is that at K Health we have repos with our internal packages, which are standalone services and also can be consumed as packages. So we'd like to share our top-level dependencies between the package consumers and the dev/deployment config of the service. Since we are using Pipenv to manage our dependencies, it would be nice to get an output of the Pipfile as a setup.py.
Sounds like a variant of #2148 (replacing requirements.txt with Pipfile).
But this is regarding setup.py.
This. Issue. Should. Not. Be. Closed.
This issue is closed because of the reasons explained above.
If you really care this much, please, make a tool yourself. With so much anticipation in this issue, I believe it'd not be difficult to gain traction if you post your solution here. And with traction, PyPA can recommend it in the packaging guide, just like Pipenv. But first you need to actually build the tool.
Also please learn to address project maintainers respectfully. See the code of conduct for reference. We are happy to have productive discussions, but we are volunteers and we are not here to simply obey individual wishes. If you can't manage that, don't bother posting.
I would recommend locking this issue to prevent further discussion. I think the point has been made clearly.
Do one thing and do it well!
Hi everyone,
@Korijn I read that part where you were explaining how you use extras_require to sync setup.py with Pipfile.
I was trying to do that and noticed that Pipfile.lock does not have the extras_require packages in the dev section, so when you have the source code in an empty venv and do pipenv install --dev, pipenv only installs the packages in install_requires.
setup.py
import os  # noqa: D100
from setuptools import setup, find_packages

def read(fname):
    """Read a file and return its content."""
    with open(os.path.join(os.path.dirname(__file__), fname)) as f:
        return f.read()

setup(
    name='auditor_client',
    version='0.0.0',
    description='Auditor client package',
    long_description=read('README.md'),
    packages=find_packages(exclude=['tests']),
    install_requires=['requests==2.9.1'],
    extras_require={'dev': ['flake8', 'flake8-docstrings', 'pytest', 'coverage', 'tox']},
    setup_requires=["pytest-runner"],
    tests_require=["pytest"]
)
Pipfile
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true
[packages]
"e1839a8" = {editable = true, path = "."}
[requires]
python_version = "3.6"
[dev-packages]
"e1839a8" = {editable = true, extras = ["dev"], path = "."}
Pipfile.lock
{
    "_meta": {
        "hash": {
            "sha256": "e58b833e497814c83a2f0b93ad21d33a2af8b72721b20e9607a6c9135978422d"
        },
        "pipfile-spec": 6,
        "requires": {
            "python_version": "3.6"
        },
        "sources": [
            {
                "name": "pypi",
                "url": "https://pypi.org/simple",
                "verify_ssl": true
            }
        ]
    },
    "default": {
        "e1839a8": {
            "editable": true,
            "path": "."
        },
        "requests": {
            "hashes": [
                "sha256:113fbba5531a9e34945b7d36b33a084e8ba5d0664b703c81a7c572d91919a5b8",
                "sha256:c577815dd00f1394203fc44eb979724b098f88264a9ef898ee45b8e5e9cf587f"
            ],
            "version": "==2.9.1"
        }
    },
    "develop": {
        "e1839a8": {
            "editable": true,
            "path": "."
        },
        "requests": {
            "hashes": [
                "sha256:113fbba5531a9e34945b7d36b33a084e8ba5d0664b703c81a7c572d91919a5b8",
                "sha256:c577815dd00f1394203fc44eb979724b098f88264a9ef898ee45b8e5e9cf587f"
            ],
            "version": "==2.9.1"
        }
    }
}
Shouldn't the correct behaviour be that Pipfile.lock tracks the extras_require dev packages?
Yes, this looks like a bug/limitation to me. You should file a separate bug/issue for this.
I think there is an issue opened in the tracker about this problem, although I cannot locate it right now. Please do search existing issues before opening one. Thanks in advance :)
This is not a bug; you cannot use the same base entry multiple times in a Pipfile. If you specify a dependency in the dev section and also in the default section, the default section takes precedence no matter what.
I would walk through my normal thought experiment but I don't have time just now, so just take my word for it that it could cause dependency conflicts and surprises when you deploy something and find out your dev dependency was hiding a conflict.
@techalchemy so how can I manage my dev dependencies in this case? I only want to know how to use pipenv in a good way
I've been thinking about this for my own project, and kind of came to realise I don't really need the packages/dev-packages distinction. How about listing {editable = true, extras = ["dev"], path = "."} in packages?
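In Pipfile terms, that suggestion collapses the two sections into one (a sketch based on the files quoted above):

[packages]
"e1839a8" = {editable = true, extras = ["dev"], path = "."}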
Check out the pipenv-setup package.
It syncs Pipfile/lockfile to setup.py:
$ pipenv-setup sync
package e1839a8 is local, omitted in setup.py
setup.py successfully updated
23 packages from Pipfile.lock synced to setup.py
You can do $ pipenv-setup sync --dev to sync development dependencies to extras_require, or $ pipenv-setup sync --pipfile to sync the Pipfile instead, and $ pipenv-setup check to do checks only.
One command to solve them all.
Is there any plan to merge pipenv-setup package to pipenv?
@peterdeme
Is there any plan to merge pipenv-setup package to pipenv?
@uranusjr @techalchemy based on the discussion above, I think pipenv might have a somewhat different philosophy. But if the maintainers agree, I'd very much like to submit a pull request and try to integrate pipenv-setup.
You can always parse the Pipfile.lock with the builtin json module. Extract the non-dev dependencies for your setup.py install_requires.
The "default" key contains nested "dicts" of the package name along with version numbers and markers.
You don't need to rely on any external imports.
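For instance, a minimal sketch of that approach (keyed to the Pipfile.lock layout quoted earlier in this thread; the helper name is made up):

import json

def lock_requires(path='Pipfile.lock'):
    """Build install_requires entries from the lock file's "default" section."""
    with open(path) as f:
        lock = json.load(f)
    requires = []
    for name, spec in lock.get('default', {}).items():
        if 'path' in spec:  # skip local/editable entries such as "e1839a8"
            continue
        req = name + spec.get('version', '')
        if spec.get('markers'):
            req += '; ' + spec['markers']
        requires.append(req)
    return requires

# in setup.py:
#     install_requires=lock_requires()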
@Kilo59 I've seen people doing this. A tip to mention: don't forget to include Pipfile.lock as a data_file in setup.py (or include it in MANIFEST.in). And that's for the lockfile with pinned dependencies. The Pipfile, on the other hand, is non-trivial to parse if you want semantic versioning in the Pipfile; the same dependency requirement can show up in multiple forms.
Thank you @Madoshakalaka your tool works nicely!
I agree with other peers that setup.py's dependencies are different from Pipfile's project dependencies. But still, having a programmable way to sync those without manual labor is a great time-saving feature. It also avoids typos/common errors.
The blackened setup.py was a nice touch too :+1: