uv is a Python tool for managing dependencies, builds, and runtimes, inspired by major players in other languages like cargo and npm. It goes a long way toward simplifying the hassle of dealing with Python runtimes and environments, and I would strongly recommend switching over to it.
- Cleanup
Get rid of pyenv
rm -rf $(pyenv root)
brew uninstall pyenv
# or however else you had pyenv installed -- get rid of it
You probably have something like this in your ~/.bashrc, ~/.zshrc, etc. Remove it:
export PYENV_ROOT="$HOME/.pyenv"
command -v pyenv >/dev/null || export PATH="$PYENV_ROOT/bin:$PATH"
eval "$(pyenv init -)"
Get rid of any remnants of pipenv
brew uninstall pipenv
python -m pip uninstall pipenv
Or something like that. Also remove any remnant virtual envs
rm -rf ~/.local/share/virtualenvs
You will probably have to delete and recreate any virtualenvs you have hanging around in specific projects.
It's probably ok to have a couple remnant Python installations around -- the macOS Developer Tools and Homebrew sometimes have to install their own. uv is pretty good at picking up on these and avoiding redundancy.
It might be a good idea to clean up those runtimes' package installs before you get into uv, i.e. wipe out everything installed by a given pip instance.
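One way to do that (a sketch -- double-check which interpreter python3 resolves to before running this, and note that editable installs in the freeze output may need to be removed by hand):
python3 -m pip freeze > /tmp/leftover-packages.txt
python3 -m pip uninstall -y -r /tmp/leftover-packages.txt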
- Install it
curl -LsSf https://astral.sh/uv/install.sh | sh
See https://docs.astral.sh/uv/getting-started/installation/ for more
- Get set up with all the python versions that you will possibly need for now
uv python install 3.10 3.11 3.12 3.13
I have no idea how the out-of-the-box global default version is chosen. But you can change it like so:
uv python pin --global 3.13
If you're in a project (i.e. something with a pyproject.toml file), uv will respect the requires-python definition, so you don't have to worry if your global default conflicts with a particular package.
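For example, a project's pyproject.toml might declare something like this (the name here is just an illustration):
[project]
name = "my-project"
requires-python = ">=3.10,<3.14"
uv will pick an installed interpreter satisfying that range, regardless of your global pin.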
You can set a local default for a given repo:
uv python pin 3.10
This creates a .python-version file in that location. uv then resolves the python version to use by looking for that file. In general, I think this is kind of a personal thing and recommend that .python-version is included in your .gitignore. Our CI pipelines should be testing across all supported Python versions anyhow.
We have all encountered the pitfall of dependency/versioning issues caused by something being installed globally instead of into a virtual environment scoped to a specific project -- because you typed pip install numpy without first activating the venv. uv provides a gated interface that helps with this.
First, there's a good chance that a uv-provided Python binary won't even give you the pip command on your $PATH:
$ which pip
pip not found
Instead, you can just get in the habit of typing uv pip instead (it's basically a drop-in replacement). If no virtual environment exists, it'll stop you from installing:
$ uv pip install requests
error: No virtual environment found; run `uv venv` to create an environment, or pass `--system` to install into a non-virtual environment
Create a new virtual environment with uv venv. You can optionally pick the env location, but the default, .venv, is pretty standard:
$ uv venv
Using CPython 3.12.9
Creating virtual environment at: .venv
Activate with: source .venv/bin/activate
Now everything installs properly (and, btw, you didn't have to activate the venv -- it knew where to put it):
$ uv pip install requests
Resolved 5 packages in 1ms
Installed 5 packages in 4ms
+ certifi==2025.1.31
+ charset-normalizer==3.4.1
+ idna==3.10
+ requests==2.32.3
+ urllib3==2.4.0
You were prompted above to activate the virtual environment, and you should. But you can also just use uv run to work from a virtual environment, as long as the directory exists and uv can find it:
$ uv pip install ipython
Resolved 16 packages in 286ms
Installed 16 packages in 59ms
$ uv run ipython
Python 3.12.9 (main, Mar 17 2025, 21:36:21) [Clang 20.1.0 ]
Type 'copyright', 'credits' or 'license' for more information
IPython 9.1.0 -- An enhanced Interactive Python. Type '?' for help.
Tip: Use `ipython --help-all | less` to view all the IPython configuration options.
In [1]: import requests  # this works because we installed it to the venv above!
In open-source contexts, I think it's best to avoid trying to bind contributors to our specific build/package management tools as much as we can. For the most part, uv is pretty good about this (i.e. it just employs broadly-used config patterns rather than rolling its own), but there are some things (e.g. building/publishing) that we should probably be cautious about adopting for these reasons.
uv has a notion of a "project": a folder that contains pyproject.toml (defining project config/dependencies/etc.) and a few other things, plus some actual Python code. If you're setting up a new lab Python library, I'd recommend just building from the lab software template instead, but if you want to spin up a personal or side project, uv init will create a directory with all of the needed basics.
We have classically managed stuff like dependency declarations manually in pyproject.toml, and I think that's a good habit to keep, but you can also use commands like uv add to declare new dependencies. There are options to specify dependency groups, version constraints, etc.
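For example (a hedged sketch -- the package names are just illustrations):
uv add requests             # add a runtime dependency to pyproject.toml
uv add "pandas>=2.0"        # add one with a version constraint
uv add --dev pytest         # add to the dev dependency group
Each of these updates pyproject.toml (and the lockfile, if you're using one) and installs into the project venv in one step.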
The uv sync command is a convenient way to immediately bring your environment up to date with what's declared in pyproject.toml:
git clone https://www.github.com/genomicmedlab/my-project
uv venv
uv sync --all-extras
This replaces the previous command we'd use for this, pip install -e ".[extras,dev,tests,etc]".
uv also supports a lockfile format, uv.lock. In light of the above, I'm hesitant to prefer this over requirements.txt; its benefits are pretty niche (I think they relate to declaring alternatives based on system architecture, which isn't something we ever worry about). Some of our projects do make use of lockfiles for deployment, and uv offers a pre-commit hook to check that those stay up to date with declared dependencies: https://docs.astral.sh/uv/guides/integration/pre-commit/
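The hook config in .pre-commit-config.yaml looks roughly like this (check their repo for the current tag to pin under rev):
repos:
  - repo: https://github.com/astral-sh/uv-pre-commit
    rev: ...  # pin to a released tag
    hooks:
      - id: uv-lock  # fails if uv.lock is out of sync with pyproject.toml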
Jupyter integration is a big one. They have a guide at https://docs.astral.sh/uv/guides/integration/jupyter/
Basically, you can start a notebook from the project's venv like so
uv run --with jupyter jupyter lab
The environment is read-only, though, so you can't do %pip install magic within the notebook. The guide linked above has instructions for how you can enable that.
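If I'm reading that guide right, the gist is to add jupyter to the project itself (e.g. as a dev dependency) so the kernel runs inside the project's own writable venv:
uv add --dev jupyter
uv run jupyter lab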
I don't really understand the vscode jupyter integration but they have some instructions for that too: https://docs.astral.sh/uv/guides/integration/jupyter/#using-jupyter-from-vs-code
uv tool run provides a way to run handy Python applications on a one-off basis, without manually creating a virtual environment. For example, let's say I am writing a quick Python script and I want to format it with ruff format. I can just run:
uv tool run ruff format my_script.py
Or, for short
uvx ruff format my_script.py
Note that if you are in the context of a project or .venv, then uv run will resolve to apps installed or declared in that context first. Otherwise, it'll just install/look them up in a temporary, uv-managed virtual environment.
You can also install tools directly so that they're globally accessible:
$ uv tool install ruff
$ ruff --version
ruff 0.5.0
In order to speed up installs, uv caches, like, everything, as much as it can. Then, when you go to install something to a new venv, it first checks for a local copy. Particularly if you're working with packages with very large binaries like pytorch (we usually aren't), your local cache might end up taking a bunch of space (I've read about people suddenly realizing their uv cache is 20GB). Every so often, it's worth running uv cache clean to clear it out.
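The relevant commands live under uv cache:
uv cache dir    # show where the cache lives
uv cache prune  # drop only unused/stale entries (gentler than a full wipe)
uv cache clean  # wipe the whole cache
Anything you clear just gets re-downloaded on the next install, so the only cost is a slower first install afterward.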
uv is a lot faster than pip. Its creator has a great talk on youtube exploring some of the ways they achieved this: https://www.youtube.com/watch?v=gSKTfG1GXYQ