Comment by oefrha

10 hours ago

For much of the ML/scientific ecosystem, you're lucky to get all your deps working with the latest minor version of Python six months to a year after its release. Random ML projects with hundreds to thousands of stars on GitHub may only work with a specific, rather ancient version of Python.

> Because otherwise this problem is trivially solved by anyone competent. In particular, building and installing Python from source is just the standard configure / make / make install dance, and it Just Works. I have done it many times and never needed any help to figure it out even though it was the first thing I tried to build from C source after switching to Linux.

I compiled the latest GCC many times with the standard configure / make / make install dance back when I was just starting to learn the *nix command line. I even compiled gmp, mpfr, etc. many times. It Just Works. Do you compile your GCC every time before you compile your Python? Why not? It Just Works.
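For reference, the "dance" being discussed really is short for CPython — a minimal sketch, assuming you have already downloaded a source tarball and have a C toolchain installed (the version number and install prefix here are illustrative):

```shell
# Build CPython from a source tarball; version and prefix are illustrative
tar xf Python-3.12.4.tar.xz
cd Python-3.12.4
./configure --prefix="$HOME/.local/python3.12" --enable-optimizations
make -j"$(nproc)"
# altinstall installs "python3.12" without clobbering the system "python3"
make altinstall
```

Using `make altinstall` instead of `make install` is the documented way to keep a self-built interpreter from shadowing the distribution's default `python3`.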

> Why not?

Time. CPython compiles in a few minutes on an underpowered laptop. I don't recall the last time I compiled GCC, but I recently had to compile LLVM and Clang, and it took significantly longer than "a few minutes" on a high-end desktop.

> Random ML projects with hundreds to thousands of stars on GitHub may only work with a specific, rather ancient version of Python.

Can you name some?

> Do you compile your GCC every time before you compile your Python? Why not? It Just Works.

If I needed a different version of GCC to make Python work, then probably, yes. But I haven't yet.

Likewise, I barely ever need a different version of Python. I keep several mainly so that I can test and verify the compatibility of my own code.
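Keeping several interpreters around for compatibility testing can be as simple as one virtual environment per version — a sketch, assuming the interpreters are already on `PATH` and the paths shown are illustrative:

```shell
# One venv per interpreter; directory names and versions are illustrative
python3.10 -m venv "$HOME/.venvs/proj-py310"
python3.12 -m venv "$HOME/.venvs/proj-py312"

# Install the project into each environment and run its test suite
"$HOME/.venvs/proj-py310/bin/python" -m pip install -e ".[test]"
"$HOME/.venvs/proj-py310/bin/python" -m pytest
```

Invoking each venv's `python` by its full path, rather than activating environments, keeps it unambiguous which interpreter a given test run used.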