Goodbye Virtual Environments?
If you’re a Python developer you’ve likely heard of Virtual Environments. A Virtual Environment is “a self-contained directory tree that contains a Python installation for a particular version of Python, plus a number of additional packages.”
Why are they so popular? Well, they solve a problem: no longer are packages installed into a mishmash of global site-packages. Instead, each project can install precise dependency versions into their own “virtual” environments.
However, they introduce some problems as well:
- Learning curve: explaining “virtual environments” to people who just want to jump in and code is not always easy
- Terminal isolation: Virtual Environments are activated and deactivated on a per-terminal basis
- Cognitive overhead: Setting up, remembering installation location, activating/deactivating
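To make that overhead concrete, here is what the per-terminal ceremony typically looks like (paths and the environment name `.venv` are illustrative):

```shell
# Create a virtual environment once per project...
python3 -m venv .venv

# ...then remember to activate it in every new terminal
. .venv/bin/activate

# python and pip now resolve inside .venv, not globally
which python

# And deactivate before switching to another project
deactivate
```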
To solve some of the above points, new higher-level tools such as pipenv, poetry, flit, and hatch have been created. These tools improve the situation by hiding the complexities of pip and Virtual Environments. However, they become complex themselves in order to hide complexity. They also have their own APIs, learning curves, and maintenance burdens.
Enter PEP 582 — Python local packages directory
A Python Enhancement Proposal (PEP) was introduced back in May of 2018 to modify Python itself to solve many of the problems Virtual Environments solve, but in a totally different and much simpler way. Note that it’s still in draft state so the proposal may still change, or it may not even be adopted at all.
I’ll let PEP 582 speak for itself:
This PEP proposes to add to Python a mechanism to automatically recognize a __pypackages__ directory and prefer importing packages installed in this location over user or global site-packages. This will avoid the steps to create, activate or deactivate “virtual environments”. Python will use the __pypackages__ from the base directory of the script when present.
This proposal effectively works around all the complexity of Virtual Environments and their higher level counterparts simply by searching a local path for additional packages.
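The path logic the proposal describes can be approximated in a few lines of Python. This is only a sketch of the idea, not the PEP's reference implementation, and it assumes the `__pypackages__/<major>.<minor>/lib` layout shown later in this post:

```python
import sys
from pathlib import Path


def add_pypackages_to_path(base_dir="."):
    """Prepend __pypackages__/<major>.<minor>/lib to sys.path if it exists.

    A rough approximation of PEP 582's lookup, not the real implementation.
    """
    version = f"{sys.version_info.major}.{sys.version_info.minor}"
    pkg_dir = Path(base_dir) / "__pypackages__" / version / "lib"
    if pkg_dir.is_dir() and str(pkg_dir) not in sys.path:
        # Local packages take precedence over user and global site-packages
        sys.path.insert(0, str(pkg_dir))
    return sys.path
```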
Try it Today
It even comes with a reference CPython implementation.
If you don’t have the time or desire to build a CPython binary, you can try a proof of concept Python wrapper I made called pythonloc (for “python local”). It is a Python package (less than 100 lines of code) that does exactly what PEP 582 describes.
pythonloc runs Python, but will import packages from `__pypackages__`, if present. It also ships with `piploc`, which is the same as `pip` but installs/uninstalls to `__pypackages__` when present.
Here is an example.
```
> piploc install requests
Successfully installed certifi-2018.11.29 chardet-3.0.4 idna-2.8 requests-2.21.0 urllib3-1.24.1
> tree -L 4
> python -c "import requests; print(requests)"
Traceback (most recent call last):
  File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
> pythonloc -c "import requests; print(requests)"
<module 'requests' from '/tmp/demo/__pypackages__/3.6/lib/requests/__init__.py'>
```
Note: This is identical to what you might find in your site-packages directory.
As you can see, `piploc` installed `requests` to `__pypackages__/3.6/lib/requests`. Running `python` demonstrated that it did not find `requests`, which is expected since it doesn't search `__pypackages__`. To make Python find it, you can run `pythonloc`, which is roughly the same as running `PYTHONPATH=.:__pypackages__:$PYTHONPATH python`. This searches `__pypackages__` and finds the `requests` installation. 🎉
You can try `pythonloc` today by running `pip install --user pythonloc`, and you can learn more at https://github.com/cs01/pythonloc.
Installing Multiple Dependencies or Lockfiles
If you have the source code available and it has a `setup.py` file, you can run `piploc install .`, then run `pythonloc` and have all your dependencies available.
If you have a `requirements.txt` file, you can run `piploc install -r requirements.txt`.
If you are using pipenv, you can generate a `requirements.txt` file with `pipenv lock --requirements`.
And finally, if you are using poetry, you can generate one with `poetry run pip freeze > requirements.txt`.
Okay, so we can install from various sources, but what if we're developing and want to generate a list of dependencies?
A new workflow you could use with the advent of `__pypackages__` is to skip creating a list of dependencies altogether and commit `__pypackages__` itself to source control. Doing that would virtually guarantee you're using the same versions because, well… you're using the exact same source code.
Assuming you don't want to do that, you could run `piploc freeze`. But this presents a problem: it shows all installed Python packages, those in `site-packages` as well as those in `__pypackages__`. This probably isn't what you want, because it includes more than what you installed to `__pypackages__`. You likely only want to output the packages installed to `__pypackages__`. That is exactly what `pipfreezeloc` does.
It is the equivalent of `pip freeze`, but it only outputs packages in `__pypackages__`. This is needed because there is no built-in way to do it with pip; for example, the command `pip freeze --target __pypackages__` does not exist.
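A minimal version of that behavior can be sketched by scanning the `*.dist-info` directories pip creates under `__pypackages__`. This is an illustration of the idea, not pipfreezeloc's actual code:

```python
import sys
from pathlib import Path


def freeze_pypackages(base_dir="."):
    """Emit name==version lines for packages installed under __pypackages__.

    Illustrative sketch only; it parses the *.dist-info directory names
    that pip writes alongside each installed package.
    """
    version = f"{sys.version_info.major}.{sys.version_info.minor}"
    lib = Path(base_dir) / "__pypackages__" / version / "lib"
    lines = []
    for dist in sorted(lib.glob("*.dist-info")):
        # Directory names look like requests-2.21.0.dist-info
        name, _, pkg_version = dist.name[: -len(".dist-info")].rpartition("-")
        lines.append(f"{name}=={pkg_version}")
    return lines
```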
So instead of running `pip freeze > requirements.txt`, you would run `pipfreezeloc > requirements.txt`.
PEP 582 is a draft proposal that introduces a new way to install and isolate packages without Virtual Environments. It also eliminates the indirection between project location and environment location, since the installation location is always in the same directory as the project, at `__pypackages__`. pythonloc (along with `piploc` and `pipfreezeloc`) is a proof of concept Python implementation of PEP 582 available today.
What do you think? Should PEP 582 be approved? (Note: I am not a decision maker in this process, just an interested observer.) Will Virtual Environments be relied on less? Does `pythonloc` improve your workflow?