Pipenv: Another toml section for testing

Created on 4 Jun 2018 · 26 comments · Source: pypa/pipenv

I'd like to know if there is a way to have a [test-packages] section in the Pipfile.
It would be good if I could have something like:

[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
django = "*"

[dev-packages]

[test-packages]
pytest = "*"

[requires]
python_version = "3.6"
Discussion

Most helpful comment

There is no "standard" way of doing things here.

That's the problem, isn't it? pipenv is trying to simplify packaging in the Python world and is trying to be a standard. But leaving major use cases behind is kind of what got Python packaging to the state it is in today.

All 26 comments

This has been raised multiple times, but I cannot find the old issue at the moment, so I'm leaving this open for now.

The essential question anyone proposing this feature should answer is why. Due to the locking nature of the Pipfile-lockfile combo, every section added would considerably increase the difficulty of the implementation, because Pipenv would need to consider the interactions between sections. It is therefore not enough to propose this based simply on aesthetics; you need to present a real, practical, insurmountable need for this feature to convince the developers to consider it.

(Or, alternatively, you can implement it yourself and promise to shepherd the feature and fix the problems it introduces.)

Thanks for the reply. I was first asking whether this feature exists or whether there is an alternative; of course I'll answer the question _why?_
Here I'll try to summarize my points and the reasons why I need this:

  1. Following the Single Responsibility Principle: an organized Pipfile is more readable and maintainable. Right now I have four requirements files: base.txt, test.txt, dev.txt, production.txt

production.txt

-r base.txt

# WSGI Handler
gunicorn==19.7.1

test.txt

# Test dependencies go here.
flake8
fake-factory
factory_boy
tox
pytest
pytest-factoryboy
pytest-cov
pytest-flake8
pytest-faker
pytest-isort
pytest-env

dev.txt

# Local development dependencies go here
-r base.txt
-r test.txt

pudb
django-pdb
ipython
ipdb

# debug
werkzeug

Moving to pipenv would force me to put those either in dev or in the default section, even though something like ipython or werkzeug, for example, is only related to development, not testing.

  2. I'm currently working with three Docker environments (dev, testing, production). For CI, I find it better to have different sections: each Docker image would install only its part of the requirements, not all of them.

I agree with you that it would add a difficulty overhead if it were implemented this way. However, I think pipenv could keep the same behaviour while introducing more options, as follows:

  • pipenv install --test-packages would install [packages] and [test-packages]
  • pipenv install would install [packages] and [dev-packages]

I do not think it will add more complexity to Pipfile.lock.

I'd love to contribute and do it myself, but I'd like to initiate a discussion like this before starting.

Thank you

Thanks for the clarification. I assume you meant

  • pipenv install --test would install [packages] and [test-packages]
  • pipenv install --dev would install [packages] and [dev-packages]

would this be correct? (Option names are up for discussion. The point is that you need --dev to explicitly install [dev-packages]; the default is to only install [packages].)
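Under those semantics, the mapping from flags to installed sections is a plain conditional; a minimal sketch (the function name and flag names are illustrative, not pipenv's actual API):

```python
# Hypothetical helper illustrating the proposed flag semantics:
# plain `pipenv install` gets only [packages]; --dev adds [dev-packages];
# --test adds [test-packages]. Not actual pipenv code.
def sections_to_install(dev: bool = False, test: bool = False) -> list[str]:
    sections = ["packages"]  # the default section is always installed
    if dev:
        sections.append("dev-packages")
    if test:
        sections.append("test-packages")
    return sections

print(sections_to_install())          # ['packages']
print(sections_to_install(dev=True))  # ['packages', 'dev-packages']
```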

The question this raises would then be: should it be possible to allow users to install all three sections? If no, what answer should we provide when users (inevitably!) come requesting this feature? And if yes…

How are we going to separate packages in the lockfile so it can cover all possible use cases:

  • Only [packages]
  • [packages] + [dev-packages]
  • [packages] + [test-packages]
  • [packages] + [dev-packages] + [test-packages] (also, how do you order the latter two?)

Should this be generalised to allow an arbitrary number of sections? I can easily imagine people coming to ask for [doc-packages], [staging-packages] and so on. Warehouse, the project behind the currently active pypi.org site, has eight requirements.txt variants.

Syntax-wise, this could be better:

[packages]
# ...

[dev-packages]
# ...

[extra-packages.test]    # This "extra-packages" prefix signifies a custom section.
# ...

[extra-packages.doc]
# ...

Note: a dot in a TOML section name signifies a subkey, so in Python this would be parsed into

{
    'packages': {...},
    'dev-packages': {...},
    'extra-packages': {
        'test': {...},
        'doc': {...},
    },
}

This way you can have arbitrary keys under extra-packages, and we can simply look them up.

The syntax for installing an extra section would be pipenv install --extra=<subkey>.
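Resolving --extra=<subkey> would then be a plain dictionary lookup on the parsed structure shown above; a sketch under that assumption (the function name, the "doc" contents, and the error message are illustrative, not pipenv code):

```python
# Illustrative lookup for the proposed `pipenv install --extra=<subkey>`
# syntax, operating on the parsed Pipfile structure shown above.
parsed = {
    "packages": {"django": "*"},
    "dev-packages": {},
    "extra-packages": {
        "test": {"pytest": "*"},
        "doc": {"sphinx": "*"},  # hypothetical contents
    },
}

def resolve_extra(pipfile: dict, subkey: str) -> dict:
    extras = pipfile.get("extra-packages", {})
    if subkey not in extras:
        raise KeyError(f"no [extra-packages.{subkey}] section in Pipfile")
    # The base [packages] section is installed alongside the extra section.
    return {**pipfile["packages"], **extras[subkey]}

print(resolve_extra(parsed, "test"))  # {'django': '*', 'pytest': '*'}
```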

But then we're back to the previous question about section combinations, except instead of 4 combinations you now have many more to consider.
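The growth is easy to quantify: each extra section can independently be installed or not, so the number of section combinations a lockfile scheme would have to cover is 2**n for n extra sections. A quick illustration (section names are hypothetical):

```python
# Enumerate every subset of extra sections, each of which would be
# installed on top of [packages]. With n extras there are 2**n subsets.
from itertools import combinations

def lock_combinations(extras: list[str]) -> list[tuple[str, ...]]:
    """All subsets of the extra sections, including the empty subset."""
    return [c for r in range(len(extras) + 1)
            for c in combinations(extras, r)]

print(len(lock_combinations(["dev", "test"])))                    # 4
print(len(lock_combinations(["dev", "test", "doc", "staging"])))  # 16
```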

I need to clarify that I am totally on board for this feature. All that's stopping me from doing it myself is that I have yet to come up with a satisfying design that is general enough without immensely complicating the implementation.

Kenneth has been pretty firm about only having dev and non-dev sections, modeled after Composer. If you want to complicate Pipfiles, it is going to take a lot of work on that front. Consider the cost to a new user trying to understand what you are accomplishing, how we would explain this to them, and what it gains for you. Tools that bootstrap the packaging ecosystem itself will always make for more complex examples.

@uranusjr Yes, this is what I meant. I found a closed issue regarding this discussion:
https://github.com/pypa/pipenv/issues/726 from 6 months ago.
I totally agree that we need to answer how to handle the combinations and design something clear.
I do like your idea of having extra-packages as well.
My point for now is that we could stick to two-section combinations, [packages] + [extra-<subkey>], and maybe add multiple lock files. It is just an idea!

@techalchemy I'm quoting Kenneth's words from the discussion I found, so I think the position is not that firm:

yeah, that's the idea, it's not a big deal. The fact that the composer community has been able to
pull this off gives me faith that we'll be able to as well.

keep in mind, the pipfile spec allows for other groups to be added in the future, so we can add
this later, if demand is really very high, but so far, it's been smooth sailing.

My point is that since Pipenv is the official tool and is meant to be a "workflow for humans", it would be good to support such use cases. I also saw more use cases in discussion #726.

Right, I've worked pretty closely with him on this, so I'm aware of his position. It's important to continue to accept and receive feedback, but to be clear, we don't _have_ to add more sections until there is a compelling design reason or until it is clear that we can no longer support standard workflows. We are open to that, but personally I'm much more interested in simplicity than in attempting to capture every edge case.

In this case, what I mean by simplicity is simplicity of the interface: simplicity of things built with the tooling, so that you can say, 'here you go, person who is new to Python, you can incorporate my Pipfile into your workflow the same as if it were your own, without stumbling over custom section names'. Until there is a compelling case for changing this, I don't see a reason to.

Regarding the "for humans" tag line: it's a design philosophy. Quoting it won't convince anyone that your feature request is good. It comes across as a tactic used in an attempt to make us feel bad for not doing what you want. Whether intentional or not, this is not an effective method for influencing design decisions. Just because you like something doesn't mean we have to support it to make a good tool. We can discuss it, but you have to understand that our user base is large and the decisions we make impact thousands of people.

I'm sorry if I've been misunderstood. I did not mean it that way, and I do understand that these decisions affect thousands of people. Furthermore, I agree 100% with all the points you mentioned.

Again, please accept my apologies for not being clear.

No worries, I just want to keep the discussion focused on whether this decision is a good idea for most people or added complexity for most people. I _feel_ it's the second one, but I am willing to be convinced otherwise.

Thanks again.

I do understand your point. For me, I _believe_ there is a middle ground where we can provide this feature without adding much complexity.
@uranusjr suggested a file structure and I've added a suggestion to it. Currently, running pipenv install installs two sections. We could keep the current behaviour and add another option, say pipenv install --extra-<subkey>, which would generate another lockfile, Pipfile-<subkey>.lock, and likewise install only two sections: [packages] + [extra-<subkey>].

There may be some complexities that I'm not aware of yet. But I _think_ it is a middle ground.

What do you think?

Thanks

One of the problems is that a single "test" section doesn't actually cover all bases. tox is a common tool used across the Python community which handles dependencies across multiple environments: different dependencies for Python 2 and Python 3, even different library versions for different abstract environments.

In the Django community that's used a lot to specify testing across multiple versions of Django, each locked with different versions of several dependencies.

There's a tox-pipenv plugin being worked on but I fear it's going in the wrong direction. Related discussion here: https://github.com/tox-dev/tox-pipenv/issues/37

I think whatever the final solution is should either rely on a single package group per Pipfile, or an arbitrary amount of package groups per Pipfile. --dev can simply be a "dev" package group for convenience. But simply adding a third won't even solve the problem it advertises solving (which is "test environments").

And to be clear, these aren't edge cases. They're established patterns. tox implements them and is popular because of that. In fact, in some cases, tox is used for non-test purposes just because of its ability to create the appropriate environment to run a helper script in (without resorting to full-blown Docker for it). This should really be pipenv's job.

Regarding added complexity: Well, where is the added complexity for most users, exactly? I'm sure there's added complexity in the pipenv codebase, but what exactly would change for current users (other than having additional flags they can use should they want to create more package groups)?

The right direction is to get tox aware of the Pipfile, IMHO.
Why would it use another section than the "dev" one?

@gsemet How would you reproduce this in a single dev section then?

https://github.com/jazzband/django-debug-toolbar/blob/c74fef5b1253554c90728f33fa6adb820173596a/tox.ini#L2-L19

tox is meant to describe precisely different environments.
Pipenv is meant to describe one environment.

To have both worlds, why not imagine a kind of inheritance, where the Pipfile describes the base for developers and specific packages can be frozen by tox.ini?
Most of the time you want pipenv for the main developer environment + unit tests + test env + Sphinx + pylint + pep8, etc.,
and tox for combinatorial tests (different versions of the Python interpreter, different versions of some dependencies, etc.).
But it should come "on top of" pipenv.

@gsemet If pipenv is meant to describe one environment, then we're kind of back to the original point: why make the difference between dev and "main"?

I'm all for a --dev flag which automatically looks at a different environment (however that env may be defined: Different pipfile, w/e). But I think any solution that applies to the dev env should be usable for arbitrary environments. Maybe dev is a special case because it gets a shortcut flag. WDYT?

main = what I need for my app to run in production

dev = all other tools. For yapf, pylint, flake8, sphinx, pytest, etc., use dev-packages. Works great.

I do not need pylint, autopep8, or pytest in production. pipenv is meant to handle these two environments, and it does so pretty well. All other combinations are out of the scope of pipenv and, frankly, most of the time can be merged into the "other"/dev-packages section.

I would be happy to have a tox with pipfile support, to enable use case such as:

  • as a developer, I develop on my main pipfile with the two sections (dev and main) using pipenv
  • I run a series of compatibility tests on my CI and locally using tox, for example against several SQL backends. tox would be able to read my Pipfile (not the Pipfile.lock) and maybe inject some additional dependencies, such as mysql, or another environment for Postgres, or several versions of a given dependency such as the twisted backend, and so on.

But the main dependencies description would be in Pipfile.

Do not forget pipenv is primarily meant for applications, so for an environment as close as possible to production.
I use pipenv for both applications AND libraries, where I would be interested in having tox to test several configurations.

@gsemet I understand the distinction between the dev and prod environments. But what I'm saying is that there may be multiple dev environments. Hell, there may be multiple prod environments!

Pipfile takes the stance of supporting exactly one prod and one dev env per Pipfile. This distinction ignores the (common) cases where more environments are needed, such as the one described in this very issue. Adding support for a third environment (e.g. a "test" env) doesn't fix the underlying issue of arbitrary environments not being supported.

Tox solves this really well. I'm all for continuing to use tox, but tox should be able to delegate environment management to pipenv. More generally, python developers shouldn't have to think about multiple ways of declaring their environments ("Okay, I'll put the prod stuff in Pipfile, the test stuff in tox.ini, ...").

I agree, in the sense that I, like others, follow an opinionated solution. I use pbr and I am pretty happy with my libs+app development pattern with Pipfile, but some do not like it. But you (as any developer) need to understand the difference between applications and libraries, what pipenv is and is not, what tox is and is not, etc. There is no "standard" way of doing things here.

There is no "standard" way of doing things here.

That's the problem, isn't it? pipenv is trying to simplify packaging in the Python world and is trying to be a standard. But leaving major use cases behind is kind of what got Python packaging to the state it is in today.

I'd like to see this as well. In the meantime, I'd appreciate any suggestions for working with pipenv as-is in this Stack Overflow question:
https://stackoverflow.com/questions/51061358/how-can-i-manage-more-than-2-package-groups-with-pipenv

@dresnick-sf Multiple pipfiles maybe?

@jleclanche Can you see a workflow that doesn't include copying and pasting packages from the main Pipfile to secondary ones? Without that, the secondary Pipfile.locks could have different versions of packages than those in the base Pipfile.lock.

I do not, and I'd like it solved as well. But there are cases where you do want versions to diverge.

I'm very sorry to see that there is no support for extra environments 😞

@gsemet

main = what I need for my app to run in production

dev = all other tools. For yapf, pylint, flake8, sphinx, pytest, etc., use dev-packages. Works great.

Here is what your "main" is in a Django project deployed with uWSGI:

prod_only = [uwsgi]
dev = base + [django-debug-toolbar, tox-pipenv, werkzeug]
main = base + prod_only

_Note that there are more dependencies for running tests, but they can be covered by a tox.ini section's deps = option (hence, in theory, no need to install them explicitly for development at all)._

Why do we need "base"?

Because uWSGI needs python-dev and build-essential installed (on Debian-based machines), and we don't want to force developers to install those two packages just to run the tests. Hence, we do well to exclude this from the lowest common denominator; otherwise, it presents an unnecessary hurdle for setting up a developer machine (e.g. for on-boarding new developers).

So, with Django you end up with at least 3 environments. Even when you're smart and move all test dependencies to deps = options in tox.ini. -- _This is a real use case._

Can this be resolved?

With Docker builds you can in theory work around this situation: Simply install the "production only" requirements in addition to "base". The former is then hard-coded in the Dockerfile.

However, this has a price: You lose all the locking mechanisms (i.e. you must manage pinning and upgrading manually). Not nice.

Does anyone see an actual solution? I'd be interested.

I agree this use case is at the limit. I recently had a similar case: on Windows I needed some packages that failed to install on Linux. So yes, having additional custom environments in the Pipfile would be great. That should not be the "norm", but it should be available as an option.

I also agree with the use case. I need to install a few packages on the staging server only, a few others in testing, and a few more in local development.
Having a separate and organized dependency list for each would be great.

