
Comment by firesteelrain

10 hours ago

Why not just use a Python container rather than rely on having the latest binary installed on the system? Then venv inside the container. That would get you the “venv of a version” that you are referring to

> Why not just use a Python container rather than rely on having the latest binary installed on the system?

Sometimes this is the right answer. Sometimes docker/podman/runc are not an option, or the headache of volumes/mounts/permissions/hardware passthrough wouldn't be worth the additional mess.

It is hard to overstate how delightful putting `uv` in the shebang is:

in `demo.py`:

    #!/usr/bin/env -S uv run
    # /// script
    # requires-python = ">=3.13"
    # ///
    print("hello, world")

Then `chmod +x demo.py; ./demo.py`
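The same inline metadata block (PEP 723) can also declare dependencies, which uv resolves into a throwaway environment on first run. A minimal sketch; the `requests` entry is just an illustrative dependency, and the script body itself sticks to the stdlib:

```python
#!/usr/bin/env -S uv run
# /// script
# requires-python = ">=3.13"
# dependencies = [
#     "requests",  # illustrative; uv fetches it on first run
# ]
# ///
import sys

# With the metadata above, `./demo.py` would have `requests` importable here
# without any manual venv setup; plain `python demo.py` would not.
print(f"running on {sys.version_info.major}.{sys.version_info.minor}")
```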

At no point did I have a detour to figure out why `python` is symlinked to `python3` unless I am in some random directory where there is a half-broken `conda` environment...

Our firm uses Python extensively, and a virtual environment for every script or project is ... difficult. We have dozens of Python scripts running for team research and in production, from small maintenance tools to rather complex daemons. Add to that the hundreds of Jupyter notebooks used by various people. Some have a handful of dependencies, some dozens. While most of those scripts/notebooks are only used by a few people, many are used company-wide.

Further, we have a rather large set of internal libraries that most of our Python programs rely on, and some of those rely on external third-party APIs (often REST). When we find a bug or something changes, more often than not we want to roll out the changed internal lib so that every program that uses it gets the fix. Getting everyone to rebuild and/or redeploy everything is a non-starter, as many of the people involved are not primarily software developers.

We usually install into the system dirs and hit a dependency problem maybe once a year, and it's usually trivially resolved (the biggest was some Google libs that had internally inconsistent dependencies at one point).

I can understand encouraging the use of virtual environments, but this movement towards requiring them ignores what, I think, is a very common use case. In short, no one way is suitable for everyone.

It's more complex and heavier than using uv. I see docker/vm/vagrant/etc as something I reach for when the environment I want is too big, too fancy, or too nondeterministic to set up locally by hand; but the entire point is that "plain Python with some dependencies" really shouldn't qualify as any of those (just like the build environment for a random Rust library).

Also, what do you do when you want to test your codebase locally across many Python versions? Do you keep track of several different containers? If you start writing some tool to wrap that, you're back at square one.

  • > what do you do when you want to test your codebase locally across many Python versions?

    I haven’t found that there was any breakage across Python 3.x. Python 2.x to 3.x yes.

    Anyway, this could all be wrapped in a CI/CD job and automated if you wanted to test across all versions.
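For what it's worth, uv itself can stand in for "several different containers" here: its `--python` flag downloads and pins an interpreter per invocation. A rough sketch, assuming uv is on PATH; `python -VV` stands in for a real test command like pytest:

```shell
#!/bin/sh
# Sketch: exercise the codebase under several interpreters via uv's --python flag.
for v in 3.11 3.12 3.13; do
    echo "=== Python $v ==="
    uv run --python "$v" -- python -VV || echo "skipped $v (uv unavailable?)"
done
```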

'we can't ship the Python version you want for your OS so we'll ship the whole OS' is a solution, but the 'we can't' part was embarrassing in 2015 already.

  • GP is referring to LTS versions though

    Many Linux distributions ship Python. Alpine and DSL don’t. You can add it to Alpine. If you want the latest, you install it.