Comment by buttsack

7 hours ago

When npm decided to have per-project node_modules (rather than shared, like Ruby and others) and human-readable configs and library files, I think the goal was to be developer-friendly and highly configurable, which it is. And package.json became a lot more than a dependency manifest as a result; it’s been a great system IMO.
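As a sketch of that point, a typical package.json today carries scripts, tooling config, and module settings alongside the dependency list (the field names below are standard npm fields; the specific values are made up for illustration):

```json
{
  "name": "example-app",
  "version": "1.0.0",
  "type": "module",
  "scripts": {
    "dev": "vite",
    "build": "vite build",
    "test": "vitest"
  },
  "dependencies": {
    "react": "^18.2.0"
  },
  "devDependencies": {
    "vite": "^5.0.0"
  }
}
```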

Combined with a hackable IDE like Atom (now Pulsar) built with the same tech, it’s a pretty great dev experience for web devs.

It’s one trade-off or another.

Python had shared packages for a long time, and those are fine up to a point, but circa 2017 I was working at a place where data scientists were building models with different versions of TensorFlow and the like, and venvs are essential for that. We were building unusually complex systems and hitting worse problems than most people, but if you do enough development you will eventually run into trouble with shared packages.
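A minimal sketch of that isolation, using only the stdlib `venv` module (the env names and TensorFlow pins below are hypothetical; each env's pip installs its own version independently of the others):

```python
import venv
from pathlib import Path

def make_env(path: str) -> Path:
    """Create an isolated virtual environment at `path`."""
    # with_pip=False keeps creation fast for the sketch;
    # a real setup would use with_pip=True so pip is available.
    venv.create(path, with_pip=False)
    return Path(path)

# One env per model, each free to pin its own TensorFlow, e.g.:
#   make_env("tf1-env")  # then: tf1-env/bin/pip install "tensorflow==1.15"
#   make_env("tf2-env")  # then: tf2-env/bin/pip install "tensorflow==2.4"
```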

The node model of looking for packages in the local directory has some appeal and avoids the need for “activation”, but I like writing Python-based systems that define one or more command-line programs I can use from any directory. For instance, if I want to publish one of my Vite projects, I have a ‘transporter’ written in Python that looks for a Vite project in the current directory, uploads it to S3, updates metadata, and invalidates CloudFront. I have to activate the venv, which is a minor hassle, but then I can go to different Vite projects and publish them.
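A minimal sketch of that "detect the project in the current directory" pattern, assuming the standard Vite config filenames; the upload and invalidation steps are stubbed out as comments, since the real tool would call boto3 (`s3.upload_file`, `cloudfront.create_invalidation`) with account-specific names not shown here:

```python
from pathlib import Path
from typing import Optional

def find_vite_project(directory: str = ".") -> Optional[Path]:
    """Return the directory if it looks like a Vite project, else None."""
    root = Path(directory)
    for cfg in ("vite.config.js", "vite.config.ts", "vite.config.mjs"):
        if (root / cfg).is_file():
            return root
    return None

def publish(directory: str = ".") -> None:
    """Publish the Vite project in `directory` (upload steps are stubbed)."""
    root = find_vite_project(directory)
    if root is None:
        raise SystemExit("No Vite project found in the current directory")
    dist = root / "dist"
    # Hypothetical real steps, roughly:
    #   for f in dist.rglob("*"): s3.upload_file(str(f), bucket, key)
    #   cloudfront.create_invalidation(DistributionId=..., InvalidationBatch=...)
    print(f"Would upload {dist} to S3 and invalidate CloudFront")
```

Because the tool keys off the current working directory, one installed venv can publish any number of projects; that is the "activate once, use anywhere" workflow described above.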

  • I still prefer shared packages because they incentivize developers to maintain a stable API. And you always have the option to manipulate path variables to get per-project setups (Java) or virtual envs (Python). Cargo and npm always seem to be straight out of Alice’s dreams (Lewis Carroll).