Comment by eyegor

3 years ago

I usually make a venv in ~/.venv and then activate it at the top of any python project. Makes it much easier to deal with dependencies when they're all in one place.
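A minimal sketch of that workflow; the `~/.venv` location matches the comment, but the `myproject` name and the `requests` dependency are just placeholders:

```shell
# Create a per-project virtual environment under ~/.venv
# ("myproject" is a placeholder name).
python3 -m venv ~/.venv/myproject

# Activate it at the top of a work session; anything installed
# with pip now lands inside ~/.venv/myproject, not system-wide.
source ~/.venv/myproject/bin/activate
pip install requests   # example dependency

# Leave the environment when done.
deactivate
```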

I am a big fan of .venv/ -- except when it takes ~45 minutes to compile the native extension code in question -- then I want it all pre-packaged.

  • At this stage [0], shipping uncompiled native extensions is not yet a bug, but it is a definite oversight by the maintainer. They should come as precompiled wheels.

    [0]: https://pythonwheels.com

    • Honestly, I don't think I've ever used a precompiled package in Python. Every package with C extensions seems to take ages to build and requires the fun of installing native system dependencies first.

      Edit: skimming through this page, precompiled wheels seem like an afterthought, and the linked packages don't even seem to mention how to integrate third-party libraries. So I can see why it doesn't deliver on its promises.

  • It's a good idea to cache sdists and wheels — for resilience against PyPI downtime, for left-pad scenarios, and even just as good netiquette — and for packages that don't have a wheel for your environment, you can fairly easily build that wheel yourself and stick it into the cache.
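A sketch of such a cache using pip's own tooling; the `./wheelhouse` directory and the `some-sdist-only-package` name are placeholders:

```shell
# Download sdists/wheels for your pinned requirements into a
# local cache directory ("./wheelhouse" is a placeholder path).
python3 -m pip download --dest ./wheelhouse -r requirements.txt

# For a package that ships no wheel for your platform, build the
# wheel yourself and drop it into the same cache directory.
python3 -m pip wheel --wheel-dir ./wheelhouse some-sdist-only-package

# Later installs can then skip PyPI entirely and resolve only
# against the local cache.
python3 -m pip install --no-index --find-links ./wheelhouse -r requirements.txt
```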

Seconding this; it's what I do on all Linux distros: just run everything inside the .venv as the site installation.

If you need extra dependencies that pip cannot handle well in the .venv case, Conda can help with its own, similar site-based installation.

I don't know how the Python installation differs between Ubuntu and Debian; they seem the same to me.