Inline run dependencies in pipx 1.4.2

While it can also do much more, Python is a fantastic language for writing small scripts and utilities, thanks to its expressive syntax and batteries-included standard library. But what if you need just a bit more? PyPI is one of the best package repositories for any language, and being able to access it without having to write a multi-file library and set up virtual environments would be a dream - one that is becoming reality. Pipx 1.4.2 has an experimental implementation of the provisionally accepted PEP 723, and I’d like to show it off here, as it’s tremendously useful for simple scripts & utilities.

The solution

You can now write files that look like this:

# /// script
# dependencies = ["rich"]
# ///

import rich

rich.print("[blue]This worked!")

And run them with pipx run:

$ pipx run ./print_blue.py

Pipx will check whether it has run a script with this environment before; if it hasn’t, it will create an environment with the listed run dependencies. Then it will run the script in that environment. That’s it! (Remember to include the ./ to ensure pipx run doesn’t try to find a PyPI package named print-blue-py.)

This is great for one-off scripts, for various tools, and even when sharing code such as in tutorials or Gists. You can specify exactly what the run requirements are in the script itself.

This inline dependency idea isn’t new; tools like pip-run have provided custom syntaxes for inline dependencies before. But this is finally providing a unified syntax that we can all agree on, which will enable different tools and editors to process these as well.

Also, this is just the beginning. There’s also a requires-python field, which is currently ignored by pipx, but in the future could influence what Python it searches for or cause it to produce a (slightly better) error if it doesn’t match. Other tools can read tool.<stuff> fields in here; Ruff has expressed interest in allowing per-file configuration by setting tool.ruff items here.
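For illustration, a fuller metadata block with both of these might look like the following. The [tool.ruff] table and its line-length setting are hypothetical here - pipx doesn’t read them, and exactly how tool tables are handled may evolve with the spec:

```python
# /// script
# requires-python = ">=3.8"
# dependencies = []
#
# [tool.ruff]
# line-length = 100
# ///

# At runtime the block above is just comments; only tools that opt in
# (like pipx run) parse it and act on it.
MESSAGE = "inline metadata is invisible to the Python interpreter"
print(MESSAGE)
```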

The problem

When writing any code, being able to use libraries provided by others is one of the most important parts of programming. Without that, we would be stuck rewriting the same bits of code forever, never advancing. But installing libraries is hard - you really should be using virtual environments, one for each independent piece of code (at least for each with unique requirements), and that’s a lot of effort that is currently rarely done correctly. The most common solution is to have a “dev” environment, defined either in requirements.txt or in a [dev] extra, with all requirements for all scripts in this one place, hoping that there are no collisions (black and tensorflow were not installable in the same environment at one point, for example). This doesn’t work very well if the code isn’t already a package, though, and it still requires a lot of special knowledge to set up (like the location of the requirements.txt file).

Previous solutions

An older solution to this is using a task runner, like tox, nox, or hatch. These allow multiple environments and make management trivial. But they split the environment specification into multiple files (tox.ini, noxfile.py, or pyproject.toml), while the solution above gives fully self-contained, single-file scripts. For example, in nox, the above example would require the following:

noxfile.py:

import nox


@nox.session()
def print_blue(session: nox.Session) -> None:
    """
    Runs the print_blue script.
    """

    session.install("rich")
    session.run("python", "print_blue.py", *session.posargs)

(Technically you could cut a few bits of this due to the simplicity of the script in our example) and you would run it with:

$ nox -s print_blue

tox.ini:

[tox]
min_version = 4.0

[testenv:print_blue]
description = Runs the print_blue script.
skip_install = True
deps = rich
commands = python print_blue.py {posargs}

(Technically you could cut a few bits of this due to the simplicity of the script in our example) and you would run it with:

$ tox -e print_blue

pyproject.toml:

[tool.hatch.envs.print_blue]
description = "Environment for the print_blue script."
detached = true
dependencies = ["rich"]
scripts.print_blue = "python print_blue.py {args}"

(Technically you could cut a few bits of this due to the simplicity of the script in our example - also, you don’t need to “export” the script like this) and you would run it with:

$ hatch run print_blue:print_blue

(Soon Hatch will also support PEP 723, by the way)

Conclusion

I’d highly recommend trying it out and seeing what you think. I think this will transform the way we write small scripts in Python, and I’m looking forward to seeing Python packaging continue to get simpler. Rust’s Cargo also recently gained inline script dependencies as a nightly feature.

If you are interested in dev dependencies for projects that do have a pyproject.toml, also see PEP 735, which has been proposed to cover dependency groups.
