Comment by shermantanktop
3 days ago
> no standard way of setting up an environment for MANY years hurt
Serious question: is that solved? I still see a forest of options, some of which depend on each other, and at last count my laptop has 38 python binaries. What's the standard way?
There's no "de jure" standard, but uv sure looks like it's on its way to becoming the "de facto" standard.
uv.
https://docs.astral.sh/uv/
It’s hard to call that a standard; it’s just the latest HN Rust craze idolisation.
Sure, but it’s demonstrably better than Poetry, which was the best until uv.
If uv isn’t a standard, it’s because not enough people have tried it. It is obscenely good at its job.
uv is an excellent piece of software regardless of the language used to write it. Really, if you do python, it's worth giving it a try, especially script mode.
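For the curious, script mode means putting a PEP 723 inline-metadata block at the top of a file and letting uv build a throwaway environment for it. A minimal sketch (the requests dependency and the URL are just examples):

    # example.py
    # /// script
    # requires-python = ">=3.12"
    # dependencies = ["requests"]
    # ///
    import requests

    print(requests.get("https://example.com").status_code)

Then `uv run example.py` reads the inline metadata, installs requests into a cached, disposable environment, and runs the script.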
Eh, I’m not so sure.
We didn’t see adoption nearly this fast for poetry, pipenv, or conda (or hatch or PDM, but I never saw those as even reaching critical mass in the first place).
Those tools got pretty popular, but it took a long time and most folks found them to have a lot of tradeoffs (miles better than Python’s first party tooling, but still).
I’m not seeing that with “uv”. Other than concerns about Astral’s stewardship model (which could be valid!), I’m not seeing the widespread “it works but is hard to use” dissatisfaction with it that I do with, say, Poetry.
Couple that with uv durably solving the need for pyenv/asdf/mise by removing the pain of local interpreter compilation entirely, and I do think that adds up to uv being fundamentally different in popularity or approach compared to prior tools. Is that “different” the same as “better”? Time will tell.
As to being written in Rust? Shrug. A ton of shops for whom uv has been transformative don’t even know or care what language it’s written in. Being Rust provides, in my opinion, two benefits: a) it sidesteps chicken-and-egg problems, because the tool that manages a programming language’s environment is written in a different language, and b) that different language is not bash.
You forgot to update your HN craze list. Zig is chic, Rust is out.
No, it’s not. Everywhere I look, uv is being adopted.
Yes. Try uv and never look back.
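If you want to kick the tires, the project workflow is roughly this (a sketch; the project name is made up):

    uv init myproject      # scaffolds a pyproject.toml
    cd myproject
    uv add requests        # resolves, installs, and writes uv.lock
    uv run python -c "import requests; print(requests.__version__)"
    uv python install 3.13 # uv can also fetch interpreters for you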
You still need pip-tools in a uv environment.
uv for project management and pipx for user-/system-wide tool installation.
uv handles that too with "uv tool".
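For example (ruff here is just a stand-in for any PyPI-distributed CLI):

    uv tool install ruff   # persistent install into its own isolated environment
    uvx ruff --version     # or run a tool ad hoc without installing (uvx = uv tool run)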
What is Astral's business model?
> Serious question: is that solved?
It depends on what "setting up" means.
Creating an environment, given that the Python binary it will use is already installed, is trivial (standard library functionality since late 2012). So is choosing which environment to use. So is installing pre-built packages, and even legacy source packages are pretty easy (but slow, and installation runs arbitrary code which is entirely needless for these) when they only contain pure Python code. Even dependency resolution is usually not too bad.
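Concretely, that baseline workflow needs nothing beyond the standard library and pip (a sketch; the activation path differs on Windows):

    python3 -m venv .venv             # stdlib venv module, available since Python 3.3
    source .venv/bin/activate         # on Windows: .venv\Scripts\activate
    python -m pip install requests    # installs a pre-built wheel into the environment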
The big problems are things like:
* building multi-language packages from source locally, because this is expected to set up temporary local build environments (and build tools have to avoid recursion there)
* using external non-Python dependencies (essentially unsolved, and everyone works around this by either vendoring stuff or by not declaring the dependency and failing at runtime) — see https://pypackaging-native.github.io/ for an overview of the problems and https://peps.python.org/pep-0725/ for what they're trying to standardize to deal with it
* dealing with metadata for source packages; in the really general case you have to build the source to get this (although the package-building API now provides a hook so that build backends can specifically prepare metadata). This is mainly because some packages have dependencies that depend on very particular platform details that (apparently) can't be expressed with the "environment marker" scheme in standard metadata (https://peps.python.org/pep-0508/#environment-markers)
* and, of course, figuring out which packages need to be in your environment (Python won't decide for you what your direct dependencies are) and managing that environment over time; see the sketch just after this list. The reason all these other tools popped up is that Pip only installs the packages and offers very basic environment inspection; it's only now starting to do anything with lockfiles, for example, now that there is finally a standard for them (https://peps.python.org/pep-0751/).
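To make that last point concrete, declaring direct dependencies lives in pyproject.toml, and the "environment marker" syntax mentioned above rides along on individual requirements. A minimal sketch with illustrative package names:

    [project]
    name = "example"
    version = "0.1.0"
    requires-python = ">=3.10"
    dependencies = [
        "requests>=2.31",
        "pywin32>=306; sys_platform == 'win32'",  # PEP 508 environment marker
    ]

A lockfile (uv.lock today, or the standardized pylock.toml from PEP 751) then pins the fully resolved set, which is the part pip historically left to other tools.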
But if you mean, is there a standard toolchain that does everything and will be officially blessed by the core language developers, then no, you should not ever expect this. There is no agreement on what "everything" entails, and Python users (a large fraction of which don't fit the traditional image of a "developer" at all) have widely varying workflows and philosophical/aesthetic preferences about that. Besides which, the core language team doesn't generally work on or care about the problem; they care about the interpreter first and foremost. Packaging is an arms-length consideration. Good news, though: the Python Packaging Authority (not at all authoritative, and named with tongue firmly in cheek, but a lot of people didn't get that) is stepping up and working on official governance (see https://peps.python.org/pep-0772/).
> at last count my laptop has 38 python binaries
Something has gone very wrong (unless you're on Windows, such that admin rights would be needed to create symlinks and by default `venv` doesn't try). To be clear, I mean with your setup, not with the tooling. You should only need one per distinct version of Python that your various environments use. I'd be happy to try to help if you'd like to shoot me an email (I use that Proton service, with the same username as here) and give more details on how things are currently set up and what you're trying to accomplish that way.
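For what it's worth, on a Unix-like system each venv normally just symlinks back to a base interpreter rather than adding another binary, so many environments should still mean only one real interpreter per Python version (illustrative; paths vary by system):

    $ python3.12 -m venv .venv
    $ readlink -f .venv/bin/python
    /usr/bin/python3.12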