Pipenv is very slow. It takes a long time to download packages, install packages, and generate the lock file.
Unfortunately, this is true.
By the way, do you use any additional indexes (sources) in your Pipfile? If so, that is a possible cause of the issue.
https://github.com/pypa/pipenv/issues/2730#issuecomment-423599283
The "Poetry" (package manager) project has much faster hashing.
It would be cool if you used their algorithm (or something faster).
And I don't get why something simple like --help needs 5 seconds.
Maybe some code paths should be lazy loaded.
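For example (just a sketch of the idea, not pipenv's actual code; the heavy_resolver module below is made up), with click the expensive imports can be deferred into the command bodies so that --help never pays for them:

```python
import click

@click.group()
def cli():
    """Entry point; rendering --help does not touch the heavy machinery."""

@cli.command()
def install():
    # Import the expensive parts only when the command actually runs.
    from heavy_resolver import resolve_dependencies  # hypothetical module, illustration only
    resolve_dependencies()

if __name__ == "__main__":
    cli()
```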
We don't use a special algorithm for hashing, we use sha256. I suspect poetry just trusts the hashes it's told if it's significantly faster at hashing. We re-hash every file for security.
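For context, the re-hashing is essentially just streaming every downloaded artifact through hashlib.sha256, roughly like this (a simplified sketch, not the exact code we ship):

```python
import hashlib

def sha256_of_artifact(path, chunk_size=1 << 20):
    # Stream the file so large wheels/sdists never have to fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return "sha256:" + digest.hexdigest()
```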
If it takes 5 seconds to run simple arguments, that's an issue and you should open an issue about it with details. We'd be glad to merge anything that gets us performance gains without costs to functionality.
Trusting the hashes? Sounds like a good idea if the hashes are checked when using the lock-file.
Pip checks the hashes too, doesn't it?
Anyway: do you request the packages sequentially? I suspect there are some loops, serial lookups, and extensive object generation.
I think the slow help call is also a symptom of this.
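Even a plain thread pool over the blocking lookups would already overlap the network waits. A generic illustration (not based on pipenv's actual fetch code; the URLs are whatever artifacts you need):

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

def fetch(url):
    # One blocking download; threads let the waits overlap.
    with urlopen(url) as response:
        return response.read()

def fetch_all(urls, max_workers=8):
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch, urls))
```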
When I skimmed the code I saw that the vendor package is not a namespace package, so every access to vendor loads all packages in it (namespace packages, by contrast, are lazy).
Even worse: patched standard packages. And patched, like vendor, is not a namespace package. def _patch_path() loads all vendored packages. Again, performance! Use a lazy loader which patches and caches on the fly (if you really want to patch).
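A minimal sketch of what I mean, using only the standard library (this ignores pipenv's actual vendor/patched layout and just shows the mechanism):

```python
import importlib.util
import sys

def lazy_import(name):
    # Create the module object now, but defer executing its code
    # until the first attribute access.
    spec = importlib.util.find_spec(name)
    spec.loader = importlib.util.LazyLoader(spec.loader)
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    spec.loader.exec_module(module)
    return module

# Nothing heavy runs here; the real import cost is paid on first use.
requests = lazy_import("requests")
```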
Pip has options to specify cache directories; use them instead of patching.
In my opinion pipenv should also not use fancy stuff like click; even the colors are lost this way. It is just bloat for a core component. I could not even pinpoint the slowdown because of the sheer number of dependencies (conditional loading and encapsulation could help).
I am slowly getting the gut feeling that a re-implementation would be easier than fixing this project. That is what makes poetry fast.
Sorry for the rant. I would really like to use pipenv (and used it in the past), but it is too slow. When I had a poor internet connection, poetry finished within minutes while pipenv took several hours.
@devkral using time:
pipenv --help  0,45s user 0,07s system 99% cpu 0,522 total
So it is noticeable, but seeing how "rare" it is to invoke pipenv without some even more time-consuming action (like installing packages), IMO it is not really a problem. Lazy loading would be nice, of course, but more benefit could be found in other places.
What I personally oppose is the notion of removing eye candy that actually speeds up looking up information, just to win back some milliseconds of CPU time at the cost of milliseconds to seconds of human time (be it during use, or during development/support).
I tried it today for the first time, but it has now been taking more than 30 minutes to install ipython and jupyter: after creating an env with pipenv using Python 2.7 and then doing pipenv shell, I tried the following:
pipenv install ipython[all] jupyterlab
and it is still at Pinning VCS Packages..., which is just one of my Python git repos.
Even though there haven't been new releases in a while, I've noticed a marked slowdown in pipenv update, mainly because the Travis CI build times out after 10 minutes of inactivity. Nothing new in terms of extra packages.
@techalchemy Duplicate of https://github.com/pypa/pipenv/issues/2284 ?
Closing as a duplicate.
If it takes a lot of time, it's bullshit; sorry about my language.
If I had time, I'd work on fixing this pipenv product. It's broken. It's extremely slow, and there is no reason for it to be this slow. I recognize that the tool wants to evaluate dependencies, but even after it has decided that it can update, pipenv spends an enormous amount of time playing with the lock file. Furthermore, the installation of certain packages, Tensorflow for example, does not work with pipenv and Python 3. The cause: pipenv does not inform the underlying setup.py that it is running in a Python 3 environment, which leads to Tensorflow attempting to install a Python 2 dependency that is incompatible with Python 3.
Are we ready to stop this stupid reliance on supporting Python 2 and move forward?
I find even pipenv init absurdly slow. I ended up removing pipenv purely because of this: it was so slow doing anything that it was affecting my productivity, so I switched back to virtualenv. Which is a shame, because pipenv looked like the best of virtualenv, pyenv, and venv... it is just horribly slow.
It's extremely slow and inefficient.