Is your feature request related to a problem? Please describe.
The `pipenv run` command is really handy, but it takes a good deal of work to make it usable when your current directory is not the one containing the Pipfile. Take something as simple as using it as a shorthand for running a Python application that requires setting the `PYTHONPATH` environment variable: it is tricky to figure out what to actually set it to. Wrapping it in a shell script makes that easier, but then telling Pipenv where that script is located becomes tricky as well.
Describe the solution you'd like
The simplest solution I can think of would be an environment variable (say, `PIPENV_PROJECT_ROOT`) that could be used in places like the `[scripts]` section of the Pipfile, and in scripts run from it.
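A rough sketch of what that could enable in a helper script launched via `pipenv run`. Note that `PIPENV_PROJECT_ROOT` is only the variable proposed above, not an existing Pipenv feature, and the `src/main.py` layout is made up for illustration:

```bash
#!/usr/bin/env bash
# Hypothetical sketch: PIPENV_PROJECT_ROOT is the proposed variable, not an
# existing pipenv feature; src/main.py is a made-up project layout.
export PYTHONPATH="$PIPENV_PROJECT_ROOT/src${PYTHONPATH:+:$PYTHONPATH}"
exec python "$PIPENV_PROJECT_ROOT/src/main.py" "$@"
```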
Describe alternatives you've considered
For my problem it would also have been nice to have a way for Pipenv to update the `PYTHONPATH` environment variable to include a path relative to the project root. But just having the project root available in the environment would likely cover more use cases.
There is `PIPENV_PIPFILE`, which I think is basically `/path/to/project/root/Pipfile`. I don't know if it works, but it is worth a try.
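For example (a sketch only; `PIPENV_PIPFILE` is an existing Pipenv setting, but whether it covers this use case is exactly what is in question here):

```bash
# Point pipenv at a Pipfile outside the current directory, then run a command in its venv
PIPENV_PIPFILE=/path/to/project/root/Pipfile pipenv run python -V
```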
Regarding `PYTHONPATH`, I (personally) think it is a bad idea for a tool to modify it in any case, since it is the absolute last resort and should be left available for the user. We should find another way to do this if possible.
Ah right. I actually noticed `PIPENV_PIPFILE` and then forgot about it again while I was looking for a way around this. But it does not solve the problem of running in-project scripts defined in the Pipfile if they refer to a relative path. So maybe it is more a question of always running those commands from the root of the project?
I'm +1 to setting this environment variable in `environments.py`.
@kristofferpeterhansel that's the approach I take: run all scripts and commands from the same directory as the Pipfile (project root), e.g. `./scripts/myscript.py` or `pipenv run myscript`. I found everything becomes a _bit_ easier that way.
> that's the approach I take: run all scripts and commands from the same directory as the Pipfile (project root), e.g. `./scripts/myscript.py` or `pipenv run myscript`

Well yes. But I am not sure I can think of a reason why pipenv wouldn't just always run the scripts with the project root as the working directory, to make this consistent.
My guess is that it would make things like `pipenv run python -c "import os; print(os.listdir())"` very confusing. If there should be consistency, it is probably never changing the working directory, instead of always changing it…
That is true. I guess it then comes down to what the intended use of the `[scripts]` block is, and whether it should be tied into the command for running commands in the virtualenv. If it is intended to be an easy way to run supporting tools (like, say, pytest or mypy), it would be much more practical to be able to do so without needing the correct current working directory. Pipenv will find the project root anyway, and it requires less work on the command line.
Things in the scripts section should already run relative to the project root, to emulate the behavior of other subcommands like `install` and `shell`, I would think. That would be my expectation. Do they not behave this way?
No, they do not. I have entries like these, which will only run if my CWD is the project root:
```toml
[scripts]
mypy = "mypy --config-file=mypy.ini src"
test = "pytest -vv src"
```
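To illustrate the failure mode with hypothetical paths (the `docs/` subdirectory is made up): the relative arguments in those entries resolve against the caller's working directory, not the Pipfile's location:

```bash
cd /path/to/project && pipenv run mypy        # works: mypy.ini and src/ resolve against the CWD
cd /path/to/project/docs && pipenv run mypy   # fails: mypy cannot find mypy.ini or src/
```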
@techalchemy There's a hack to not `chdir` to the project root specifically for `run`, I don't know why. https://github.com/pypa/pipenv/blob/bba2f38d/pipenv/project.py#L115
Hmm. So it was specifically added under the label of "improvement" (https://github.com/pypa/pipenv/commit/ef11b3e854da1df327bd16a6e9043190f92b8c8a). Maybe @kennethreitz can remember the reasoning?
Calling `pipenv run <unaliased command>` doesn't operate relative to the project root, for obvious reasons: if you are in a subdirectory and want to run a script, you want the behavior that exists. We don't need Kenneth to explain that.
Scripts, however, need to behave predictably. When you call a script you want it to run the same thing whether you are in the project root or some other directory, I would think. For example, if calling a Django project's `manage.py` from the scripts section, I would expect it to work with a path relative to the project root whether or not my current directory happens to sit in the same relation to that file.
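To make the distinction concrete (the `manage` alias below is hypothetical, e.g. a `manage = "python manage.py"` entry in `[scripts]`):

```bash
# Unaliased: arguments resolve against the caller's working directory
pipenv run python manage.py migrate   # only works if manage.py is in the CWD

# Hypothetical [scripts] alias: the expectation voiced here is that it should
# resolve against the project root regardless of where it is invoked from
pipenv run manage migrate
```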
Considering this a bit, changing the behavior from what it is now would be quite difficult. The best option, as Kenneth indicated, is likely to allow users to set a flag that says to respect the project root for scripts, and then go through some cycles: if everyone is just setting the flag all the time for every project, deprecation cycles will allow us to make it the default.
Do you think this flag could also affect the behavior of `pipenv shell`? I'd love it if `pipenv shell` could retain the directory. My use case is when the Pipfile lives one or two directories above. Right now I very consistently run `pipenv shell`, then `cd ./back/to/where/I/was`.
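A workaround sketch for keeping the current directory (not a Pipenv feature, just manual activation; it assumes you are close enough to the Pipfile for pipenv to locate it):

```bash
# Activate the venv in the current shell instead of `pipenv shell`,
# which keeps the working directory as-is
. "$(pipenv --venv)/bin/activate"
```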
+1 on the separate behavior for `pipenv run ./myscript` vs `pipenv run aliased_command` that you stated above, even if it's behind a flag.
I wouldn't have any problems with the approach suggested by @techalchemy. It would solve my use case just fine.
I think the following is related. I've created a CLI tool; for the sake of example let's say all it does is `ls` (list files/folders in the CWD). I use `pipenv` while developing it, and want to be able to run it from anywhere. I came up with the following bash script that achieves what I want:
```bash
current_dir="$PWD"
source_dir="/path/to/tool"

# Activate the tool's virtualenv, then return to where we started
cd "$source_dir"
. "$(pipenv --venv)/bin/activate"
cd "$current_dir"

# Run the tool with the caller's original working directory intact
python "$source_dir/tool.py" "$@"
```
Is there currently an easier way to do this? If not, could something like the following work?

```bash
pipenv --project /path/to/tool run tool.py $@
```
This is probably simpler:

```bash
pushd /path/to/tool          # temporarily enter the project so pipenv can locate its venv
PYTHON="$(pipenv --venv)/bin/python"
popd
"$PYTHON" "/path/to/tool/tool.py"
```
I've realised something even simpler that works well. You can simply run `pipenv --py` to get the path to the venv's Python, and then set it in your script's shebang line:
```python
#!/home/user/.local/share/virtualenvs/dir-hash/bin/python
"""Do stuff in venv while keeping original working dir

Also works if symlinking to this script..."""
```
I also created a script that generates the equivalent of a symlink for python scripts that require a pipenv, for situations where one doesn't want to add a shebang to their script and/or it is public and used by others too.
https://gist.github.com/shadow-light/a64bab7243f3e4f759ef7e1b203e97e2
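Making the shebang script executable and symlinking it onto the PATH, as described above, might look like this (hypothetical paths):

```bash
chmod +x /path/to/tool/tool.py
ln -s /path/to/tool/tool.py ~/.local/bin/tool   # "also works if symlinking to this script"
```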
FWIW, my use case is as follows:

```
project_root
- ansible/
- build-shiv-app.sh
- Pipfile
```

When running Ansible commands to deploy, I first need to run the build script. The build script contains commands that need to be run with dependencies from the Pipfile. I'm almost always inside the `ansible` directory when running Ansible commands.
So, ideally, I'd be able to do:

```yaml
# ansible/deploy.yaml
tasks:
  - name: build executable with shiv
    command: pipenv run build-shiv-app
    delegate_to: localhost
```

```toml
# Pipfile
[scripts]
build-shiv-app = "bash build-shiv-app.sh"
# or something like
build-shiv-app = "bash $PIPENV_PIPFILE_DIR/build-shiv-app.sh"
```
Hello @uranusjr and @techalchemy, I had a related question I thought was OK to ask in this thread. First of all, thanks for the amazing work you're doing with Pipenv. Having experienced simpler workflows with ruby and bundler and similar tooling, trying to figure out a good pattern to follow for python apps is turning out to be a daunting task, so it's great to see progress in simplifying/unifying things here.
My question is this: in all of the discussion above, as well as others (e.g. https://github.com/pypa/pipenv/issues/1263), it is still not 100% clear to me whether Pipenv was designed to support running applications in a production deployment environment, as opposed to a development environment. I have a project I'm trying to convert from `setup.py` + `requirements.txt` over to Pipenv, but when it came time to run my console script defined in the `entry_points` section of my `setup.py`, as far as I can tell from the above, the only alternative is to write a wrapper shell script that either changes directory to where my Pipfile is and then runs `pipenv run`, or loads the Pipenv venv first and then just runs Python. These are things I never had to worry about when installing with `pip` or `setup.py` before; they took care of creating and installing the wrapper scripts for me. So, is it a goal of Pipenv to support that use case or not? And, if it is, do you have any plans to simplify that process?
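For reference, the first wrapper-script shape described above might look like this (a sketch only, with a hypothetical `my-console-script` entry point and project path):

```bash
#!/usr/bin/env bash
# Change to the project root so pipenv finds the Pipfile, then hand off to the entry point
cd /path/to/project && exec pipenv run my-console-script "$@"
```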
BTW, if my application is structured as a set of Python packages, what's the expected way to include those packages in the venv with Pipenv? If I do `pipenv install -e .`, that requires me to have a `setup.py`, which again defeats the purpose of the Pipfile in my understanding...

Thanks in advance for your time.
@luispollo Your "console script defined in the entry_points section of my setup.py" should work out of the box.

Keep your `setup.py` and run `pipenv install -e .` to install your package in editable mode. When you `pipenv shell`, `your-command` will be on the PATH. You can also run `pipenv run your-command`.

Split dependencies between `setup.py` and Pipfile as you deem fit. I generally keep it all in the Pipfile for an _application_ (not a library).
The `setup.py` file can be as small as:

```python
from setuptools import setup, find_packages

setup(
    name='myapp',
    version='0.0.1',
    packages=find_packages(),
    entry_points={
        'console_scripts': [
            'your-command=myapp.cli:main',
        ]
    },
)
```
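Putting the steps above together (assuming the example `myapp` package with its `your-command` entry point):

```bash
# Install the package in editable mode into the pipenv-managed venv,
# then invoke the console script through pipenv
pipenv install -e .
pipenv run your-command
```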
Thanks @AlJohri. That's pretty much what I ended up with, but there are a couple of issues with this approach IMHO:

- The command only works from inside `pipenv shell`.
- It still requires a `setup.py`, which seems to (at least partly) defeat the purpose of using a Pipfile in the first place, again with no guidance on which of those two places I should keep my dependencies defined in.

What I would prefer would be to do away with my `setup.py` altogether, and to have a mechanism by which Pipenv itself was able to create/install wrapper scripts to execute my code, similar to what setuptools already provides, but with the benefit of the isolated virtualenv. But, as I said, I wasn't sure whether this was a design goal for Pipenv to begin with...
With pipenv-shebang you can run your script by inserting the shebang

```
#!/usr/bin/env pipenv-shebang
```

or you could run it with `pipenv-shebang PATH/SCRIPT` directly.
@laktak 🥇