What's the problem this feature will solve?
It can sometimes be useful to install from a pyproject.toml that has a list of install_requires = [] without having to build the whole package.

pip install -r pyproject.toml --key tool.enscons.install_requires

would open pyproject.toml, look up ['tool']['enscons']['install_requires'], and install as if those items had been passed as arguments to pip install, or written out to a requirements.txt file and then installed.

It wouldn't work correctly if the package referenced its own extras, unless a second key for the extras dict were passed. If this feature were added, it would be easy to support .json as well.
I'd rather support -r -, which would then allow you to use some sort of TOML equivalent of jq[1] to read the values you want out and pass them in that way. That would be more general than a specific pyproject.toml-reading feature.

[1] A quick search found https://github.com/jamesmunns/tomlq
Personally I tend to think all -r usages on “non-locked” requirement specifications should be discouraged, since they do not fit well into the recent best-practice trend of lock files. I would be in favour of introducing a new requirements file format, but pyproject.toml should not be it; the lock file format should.
> I'd rather support -r -
I'm looking for something like this too.
My use case is making sdists in CI. pip wheel with its PEP 518 support means I no longer need to install build dependencies (from a requirements.txt or otherwise) prior to making a wheel, and I don't need to play import-guarding tricks in setup.py to make sure it is runnable without build requirements.

That's great if I'm making a wheel, but the same problems exist for sdists, which remain unsolved. Even though I'm not building anything for an sdist, some of my projects still import cython in their setup.py, and they still use setuptools_scm for versioning. So to make an sdist some of these requirements are needed, and I'd have to resort to tricks again to delay importing the others, or install all build dependencies before running python setup.py sdist.
So what I would really like is a pip sdist command that does the build isolation process and installs the 'build' dependencies but makes an sdist instead of a wheel. But a pip install -r pyproject.toml would work pretty well too - no isolation, but it doesn't matter on the CI server.
pip install -r pyproject.toml is a bit nicer than

```shell
pip install toml && python -c 'import toml; c = toml.load("pyproject.toml"); print("\n".join(c["build-system"]["requires"]))' | pip install -r /dev/stdin
```

which is what I'm currently considering putting in my CI.
@chrisjbillington for building sdists, you could try python -m pep517.build from the pep517 project. Maybe one day pip or twine or some other tool will gain that capability, but in the meantime that should do the job.
Thanks @sbidoul, looks like I should use pep517 until the discussion about where the 'build an sdist according to PEP 517' tool belongs is settled.
Maybe pep517 should be renamed pyproject, with python -m pep517.build given an official entry point as pyproject build. (Alas, pyproject is taken on PyPI.)
> (Alas, pyproject is taken on PyPI.)
Ah, unlucky. Even if the author were happy to relinquish it (it doesn't look active, though the author's other projects are), it'd be strange to re-use a package name for something completely different. An alternative could be to include a renamed pep517 in the standard library, or ship it with pip. Then it wouldn't need a PyPI name.
> (Alas, pyproject is taken on PyPI.)
That's definitely an invalid project. Filed https://github.com/pypa/pypi-support/issues/417.
It looks like a real project to me, and it seems to predate the PEPs introducing pyproject.toml, so it doesn't look like intentional name-squatting. It's just unmaintained.
This works pretty well FWIW...

pyproject.toml:

```toml
[install_requires]
django = "*"

[extras_require.dev]
django-debug-toolbar = "*"

[build-system]
requires = ["setuptools", "wheel", "toml"]
```
setup.py:

```python
from setuptools import setup
import toml

with open("pyproject.toml", "r") as f:
    requirements = toml.loads(f.read())

prod = requirements['install_requires']
dev = requirements['extras_require']['dev']

setup(
    # "*" means any version; otherwise append the version specifier.
    install_requires=[x + prod[x] if prod[x] != "*" else x for x in prod],
    extras_require={'dev': [x + dev[x] if dev[x] != "*" else x for x in dev]},
)
```
Then just pip install -e .[dev] as usual. I landed on this solution after losing patience with the slowness of pipenv.
Please do not use non-standard top-level keys in pyproject.toml. All top-level keys except tool are reserved for future use by Python packaging, according to PEP 518.
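A variant of the earlier hack that stays within PEP 518's rules would nest the custom tables under tool. The mypackage name below is a placeholder for your own project's name, not anything standardized:

```toml
[tool.mypackage.install_requires]
django = "*"

[tool.mypackage.extras_require.dev]
django-debug-toolbar = "*"

[build-system]
requires = ["setuptools", "wheel", "toml"]
```

The setup.py shown earlier would then read requirements['tool']['mypackage']['install_requires'] and requirements['tool']['mypackage']['extras_require']['dev'] instead.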