RIP Pipenv: Tried Too Hard. Do what you need with pip-tools.

Nick Timkovich
Telnyx Engineering
Jan 24, 2020 · 4 min read

Pipenv is dead. It went all of 2019 without a single release, despite about 650 commits to master since the last one. Comments from developers on the project explain that it “has been held back by several subdependencies and a complicated release process”. In the past there were also comments that internal changes to pip broke it, and pip will keep evolving now that the PSF is funding developers to work on it. Will they hit this moving target and make a release in 2020? Do I care?

Rocketing in popularity! (Image: NASA/Joel Kowsky)

My principal reasons for using Pipenv were twofold. First was the ability to have reproducible builds through an exhaustively defined lock file, with every package and version specified, then hashed for good measure. Second, I could do all that while maintaining a clean, minimal list of direct dependencies for my application.

Pin everything…

The reason for a lock file should be clear: so you can have reproducible builds. pip install mypackage may work when you build and test on your workstation, and through one release cycle, but what about when you make a small fix later? Why did changing one line of code and rebuilding cause everything to explode? Because there was a new, incompatible release of one of your dependencies. Precisely defining all packages and their exact versions leads to reproducible builds that are easier to diagnose and bisect when they go wrong.
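For example, here is the difference between the single direct dependency you care about and the full, pinned set that actually gets installed (the package and version numbers are only illustrative of what pip freeze might report):

# what you ask for
requests

# what a lock-style file pins after resolution
requests==2.22.0
certifi==2019.11.28
chardet==3.0.4
idna==2.8
urllib3==1.25.7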

Bonus features in the lock file include package hashes. They can improve security, but the failures I’ve seen have come from using multiple package indexes. At Telnyx we have a private PyPI that we use to distribute Python libraries. We adapted a synchronous library to asyncio, but later someone else did the same and released it publicly under the same name. The collision meant we sometimes downloaded the private version and sometimes the public one, depending on how the indexes were configured on the user’s machine, their build container, or the CI infrastructure. Our private PyPI also allows re-uploading the same version, which the failing hash check also makes immediately obvious.
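When hashes are generated, each pinned entry also carries the digests pip will verify at install time, so a swapped or re-uploaded artifact fails loudly instead of installing silently. The digests below are truncated placeholders, just to show the shape:

requests==2.22.0 \
    --hash=sha256:0123456789abcdef... \
    --hash=sha256:fedcba9876543210...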

The locked list of dependencies should be deliberately generated and validated through tests. This is usually done by a developer on their workstation, where they can check the result immediately rather than pushing a broken commit to the repo. Diffing the lock file shows which new versions were installed, and the offending package can be pinned back to an older version, or the source changed to accommodate the new one.
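For instance, after regenerating the file, the diff might come down to a single line, pointing straight at the suspect (the versions here are made up):

# `git diff requirements.txt` after regenerating
-urllib3==1.25.7
+urllib3==1.25.8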

…and also only what you need

OK, but why do we need a separate list of direct dependencies? Imagine a simple cycle of pinning everything in a single requirements.txt and regenerating it with pip freeze. The file will not track why certain packages are installed, and if you explicitly remove a direct dependency from it, you’ll keep all of its sub-dependencies around for no reason. You can also land in version-resolution hell, where bumping one package means it needs a different version of another package than the one you have pinned. Why is that package pinned? Are you using it directly, where you care about the changes, or is it only there for one of your deps? Or several of them?

In this minimal list, we pin only what we need, deliberately excluding incompatible or buggy versions and allowing incidental upgrades where we can. Those upgrades bring bug fixes and performance improvements (admittedly, sometimes regressions too), and keep us from building against deprecated features of older libraries that would become a large maintenance burden later.
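A requirements.in in that spirit stays short and only records the decisions you actually made (the names and pins here are illustrative):

# requirements.in: direct dependencies only
flask                # unpinned: take whatever is current at compile time
requests>=2.20       # need a fix that landed in 2.20
somelib!=1.4.2       # hypothetical: exclude one known-bad release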

pip-compile: does one thing

Note that none of the features listed include virtual environment management. I use Pyenv for this on my machine, and in production, plain pip and the built-in venv module are best. Why install anything extra when you can create a virtual environment with one line and pip install with one more? The commands are ever-so-slightly onerous to type with full paths, but you only type them once in the Dockerfile or other setup scripts anyway.

python3 -m venv /opt/appvenv
/opt/appvenv/bin/python3 -m pip install -r requirements.txt
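In a Dockerfile that is only a couple of lines more; a minimal sketch, with the base image and paths as placeholders:

FROM python:3.8-slim
COPY requirements.txt /tmp/requirements.txt
RUN python3 -m venv /opt/appvenv && \
    /opt/appvenv/bin/python3 -m pip install -r /tmp/requirements.txt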

So how do I get that requirements.txt while holding on to the features I liked? Enter pip-tools. Making a wiser decision than Pipenv, pip-tools follows the UNIX philosophy of “do one thing, and do it well”. It provides a few commands, and one of them, pip-compile, can bake one requirements-style file into another, complete with versions and hashes.

pip-compile --quiet \
--generate-hashes \
--output-file=requirements.txt \
requirements.in

Pipenv uses this library to do it — we can cut out the middleman! What’s more, the generated requirements.txt is perfectly readable by pip, so you don’t need to install the tooling in your production system, saving time, space, and exposure to bugs.
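Day-to-day updates are also just pip-tools commands; a sketch of the usual workflow (double-check pip-compile --help on your installed version):

# bump everything to the newest versions allowed by requirements.in
pip-compile --upgrade --generate-hashes --output-file=requirements.txt requirements.in

# or bump a single package
pip-compile --upgrade-package requests --generate-hashes --output-file=requirements.txt requirements.in

# make the local virtualenv match the lock file exactly (installs and uninstalls)
pip-sync requirements.txt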

I think I’ve been guilty of inflating Pipenv’s download stats; every container build pulls it

How can you make the switch? If you’re willing to shake up the versions from your Pipfile.lock, converting the Pipfile to a requirements.in is sometimes just a slightly tedious bunch of editing. However, if you have a lot of libraries and want to keep their versions locked to make sure changing tools didn’t break anything, I wrote a script that will do it for you!

# in a path with a Pipfile and Pipfile.lock
# prereqs: `pip install toml pip-tools`
curl -sSL https://raw.githubusercontent.com/nicktimko/pipenv2tools/master/pipenv2tools | python -
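If you’d rather do the editing by hand, the mapping is mostly mechanical; a hypothetical before and after:

# Pipfile, [packages] section
requests = "*"
flask = ">=1.0"

# equivalent requirements.in
requests
flask>=1.0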

So even if Pipenv came out with a new release fixing all the outstanding bugs and pain points I’ve come across, I won’t go back.
