I was inspired to ask about specifying the project dir as a command line arg (https://github.com/pypa/pipenv/issues/2237) while thinking about invoking tools in their own venvs. At some point I realized that I could use a venv's python
to invoke a tool and it would just work without having to activate anything in my shell. I assume this is entirely by design.
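For example (the paths are just an illustration), nothing needs to be activated because a venv's interpreter locates its own site-packages:
~/venvs/mytool/bin/python ~/src/mytool/tool.py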
With pipenv managing venv paths, it might be compelling to be able to invoke a tool via pipenv, where pipenv would determine the venv from the tool's path and not the current working directory. Perhaps pipenv could be the tool runner on the shebang line, something like this:
#!/usr/bin/env pipenv run
import mystuff
import sys

def main(args):
    # And so forth.
    return 0

if __name__ == '__main__':
    sys.exit(main(sys.argv[1:]))
Is this compelling? Or am I missing a different best practice for organizing tools with their own venvs? Or do people just prefer to not do this and use global site-packages for everything they intend to run on their own machines? (And if I'm dev'ing my own tools, I would do actual releases into my everyday-carry environment, and use venvs just for development?)
You should run $ pipenv --venv and point to the Python interpreter within that venv.
Right, I get that I can manually determine the venv path and just put that path into my script. I'm asking if it'd be compelling to do this programmatically from the shebang line itself. As pipenv "owns" the translation of a project path to a venv path, it'd be more portable and more future-proof if a script knows how to ask pipenv every time.
I'd like to be able to clone a GitHub repo into project/, do a one-time cd project && pipenv install, then ever after just run project/tool without having to cd project && pipenv shell first. Because the venv path is specific to my machine, tool's shebang line can't know it in advance.
Without additional support from pipenv, I can't think of a way to do this that doesn't involve a relatively sophisticated wrapper script that invokes pipenv --venv from the project/ subdir and spawns a subshell to run another script with the appropriate Python interpreter. I'd gladly accept a solution that just makes such wrappers easier to write, such as a pipenv.run_in_env('path/to/script.py') Python function similar to subprocess.run().
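Roughly, the wrapper I have in mind would look like this (untested sketch; the name run-in-project-env is made up, and it relies only on pipenv --venv printing the venv path for the current directory):
#!/bin/sh
# run-in-project-env: run a script with the interpreter from its project's pipenv venv.
# Usage: run-in-project-env path/to/project/tool.py [args...]
set -e
script=$1
shift
project=$(dirname "$script")
# Ask pipenv, from inside the project directory, where the venv lives.
venv=$(cd "$project" && pipenv --venv)
exec "$venv/bin/python" "$script" "$@"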
(I realize shebang handling is platform-specific. It's just the first thing that occurred to me because it gives pipenv the opportunity to pick the Python interpreter.)
I just encountered issues with setting up tool scripts to just run within pipenv, too. Here are the instructions I want to give to other people on my team, without them having to delve further into how the pipenv tool itself works (example below is for a Mac):
brew install pipenv
git clone [email protected]:tools.git
cd tools && pipenv install
./tool1.py
Is there a way to do that?
I think this issue is still outstanding (despite being marked as closed). Presumably the path referred to by @kennethreitz is not suitable for e.g. being checked in to a git repo? A generic way to launch a tool without setting up an environment first would be great. (Imagine if standard unix tools required environment setup before you ran them...)
This is an issue, but not one Pipenv wishes to get into. Pipenv is a tool to manage environments, not to help package and redistribute your application. You should use other tools for that purpose.
I think I agree - why can't your setup script install your virtual environment in a specific location every time?
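For instance, a setup script could pin the venv to a known location inside the project via pipenv's PIPENV_VENV_IN_PROJECT option (sketch; paths illustrative):
# One-time setup: keep the venv at ./.venv inside the project
export PIPENV_VENV_IN_PROJECT=1
pipenv install
# Tools can then rely on a fixed interpreter path
./.venv/bin/python tool1.py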
It can, but I think this feature is useful when you're specifically not packaging or distributing an application, just trying to run an arbitrary script with an arbitrary environment.
What do you do when you just want a script in your home directory running against a specific environment? Copy and paste the magic path into the #! line? The proposed solution would be nicer IMHO.
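For reference, the "magic path" version is something like this (the virtualenv path is hypothetical and machine-specific):
#!/home/alice/.local/share/virtualenvs/mytool-a1b2C3d4/bin/python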
The proposed solution is nicer, but impossible, AFAICT. I would be happy to be proven wrong.
This should do it (works for me):
#!/usr/bin/env pipenv run python
@cfzlp /usr/bin/env: ‘pipenv run python’: No such file or directory
I believe the method you propose works on BSD (and macOS), but not Linux.
You're right. I didn't know this, but on Linux execve behaves differently and passes 'pipenv run python' as a single argument to env, and that doesn't work...
http://mail-index.netbsd.org/netbsd-users/2008/11/09/msg002388.html
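For reference, the failure can be reproduced by hand, since that single string is exactly what Linux hands to env as its first argument:
/usr/bin/env 'pipenv run python' ./tool1.py
# -> /usr/bin/env: ‘pipenv run python’: No such file or directory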
Using the following one-liner as a wrapper is a nice workaround:
#!/bin/sh
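# $(basename $0) is this wrapper's own name (e.g. "bash" or "python"); pipenv runs that command, and "$@" carries the calling script plus its arguments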
pipenv run $(basename $0) "$@"
You can - for instance - call it /somepath/bash or symlink /somepath/bash to it, and use it in your scripts as:
#!/usr/bin/env /somepath/bash
which python # should show a path inside your pipenv
This works around both the Linux limitation on shebang arguments and BSD's/macOS's limitation on non-binary shebang interpreters.
And this way you can basically wrap anything - any shell or python itself - with minimal effort.
Note: this wrapper or its symlink cannot appear in the $PATH before the executable you're trying to wrap, otherwise you'll get an infinite interpreter loop.
You should quote $0 as well and use exec:
#!/bin/sh
exec pipenv run "$(basename "$0")" "$@"
not sure if all shells can handle nested quotes, so maybe even:
#!/bin/sh
script=$(basename "$0")
exec pipenv run "$script" "$@"
Slightly changing what @cfzlp suggested works for me on Debian (coreutils version 8.30):
#!/usr/bin/env -S pipenv run python
@fernandezcuesta Unfortunately -S is not a recognized option for env on Ubuntu.
FWIW with Nix you can just do e.g.
#! /usr/bin/env nix-shell
#! nix-shell -i python3 -p "python3.withPackages(ps: [ps.numpy])"
@laktak: the portable version is /usr/bin/env, not /bin/env; also, you should port your shell script to /bin/sh, not /bin/bash, to be portable.
Thanks for the input, I've updated it.