> How do I manage my code without a “package manager”? [...] Through manual dependency management.
Slackware Linux does precisely that.
I'm a Slackware user. Slackware does have a package manager that can install or remove packages, and even a frontend that can use repositories (slackpkg), but dependency resolution is manual. Sure, there are 3rd-party managers that can add dependency resolution, but they do not come with the distro by default.
This is a very personal opinion, but manual dependency management is a feature. Back in the day, I remember installing Mandrake Linux 9.2 and activating the (then new-ish) framebuffer console. The distro folks had no better idea than to force a background "9.2" image on framebuffer consoles, which I hated. I finally found the package responsible for that. Removing it with urpmi, however, meant removing all the graphical desktop components (including X11) because that stupid package was listed as a dependency of everything graphical.
That prompted me to seek alternatives to Mandrake, and I ended up using Slackware. Its simplicity had the added bonus of offering manual dependency resolution.
> Dependency hell [0] is a real thing which anyone who has worked on a large project has experienced. Projects having thousands, if not tens of thousands, of dependencies where you don’t know if they work properly, where the bugs are, you don’t know how anything is being handled—it’s awful.
Same. The first thing I thought was "wait a second, that isn't dependency hell".
The second thing is that their version of dependency hell - having lots of dependencies introducing lots of bugs that you would not have written - is not my experience. 99% of the time, my bugs are in my own code, lol. Maybe once you become a much better programmer than me, you stop writing bugs in your own code and instead start having to deal with bugs in the PNG parsing library you depend on or something, and at that point writing your own PNG parsing library becomes a good use of your time. But I'm certainly not at that point.
I've had to fix bugs in dependencies of course. Here is one I fixed yesterday [0]. But it's much closer to the exception than the rule.
I don't know what the solution to this problem is, but I do remember a time (around 20 years ago) when this wasn't a big problem. I was working on a fairly large (each module between 50k - 100k LOC) C++ system. The process for using libraries:
1) Have problem that feels too complicated to hand-code.
2) Go on Internet/forums, find a library. The library is usually a small, flat collection of atomic functions.
3) A senior engineer vets the library and approves it for use.
4) Download the stable version: header file, and the lib file for our platform (on rare occasions, build it from source)
5) Place the .h file in the header path, and the lib file in the lib path; update the Makefile.
6) #include the header and call functions.
7) Update deployment scripts (bash script) to scp the lib file to target environment, or in some cases, use static linking.
8) Subscribe to a mailing list and very occasionally receive news of a breaking change that requires a rebuild.
This may sound like a lot of work, but somehow, it was a lot less stressful than dealing with NPM and node_modules today.
I think the main thing that makes this workable is "The library is usually a small, flat collection of atomic functions."
I find that it's the hell of transitive dependencies--you as a developer can reasonably vet a single layer of 10-30 standalone libraries. But if those libraries depend on other libraries, etc, then it balloons into hundreds or thousands of dependencies, and then you're sunk.
For what it's worth, I don't think much of this is essential complexity. Often a library is complicated because it supports 10 different ways of using it, but when you use the library, you're only using 1 of those ways. If everyone is only using 10% of thousands of transitive dependencies, the overall effect is incredibly complicated, but could have been achieved with 10-100% more short-term effort. Sure, "it took twice as long to develop but at least we don't have 10x the dependencies" is a hard sell to management (and often to ourselves), but that's because we usually choose to ignore the costs of depending on software we don't understand and don't control. We think that we're cleverly avoiding having to maintain and secure those libraries we outsourced, but most open-source developers aren't doing a great job of that anyway.
Often it really is easier to develop something from scratch, rather than learn and integrate a library. Not always though, of course.
In C and C++ you don't need the transitive dependencies for compilation; you only need the headers of the direct dependencies. As for linking, they are only needed when linking dynamically, which was much less prevalent 20 years ago.
This reads much more like a critique of traditional open-source development than package managers themselves.
The author asserts that most open-source projects don't hit the quality standard where their libraries can just be included and they'll do what they say.
I assert that this is because there's no serious product effort behind most libraries (as in no dedicated QA/test/release cycle), and no large commercial products use them (or if they do, they either do so in a very limited fashion or just fork them).
Hobbyists do QA as long as it interests them/fits their usecase, but only the big vendors do bulletproof releases (which in the desktop realm seems to be only MS/Apple)
This might have to do with the domain the author chose - desktop development has unfortunately had the life sucked out of it, with every dev being a fullstack/cloud/ML/mobile dev; its mindshare and the resources going toward it have plummeted.
(I also have a sneaking suspicion the author might've encountered those bugs on desktop Linux, which, despite all the cheerleading (and policing of negative opinions), is as much of a buggy mess as ever.)
In my experience, you're quite likely to run into a bug that nobody has ever written about on the internet.
This is why I'm so glad that I work in a closed monorepo now. There is no package management, only build tooling.
I find myself nodding along to many of the technical and organizational arguments. But I get lost in the licensing discussion.
If it is a cultural problem that people insist on giving things away for free (and receiving them for free), then viral licenses can be very helpful, not fundamentally pernicious.
Outside of the megaprojects, my mental model for GPL is similar to proprietary enterprise software with free individual licenses. The developer gets the benefits of open projects: eyeballs, contributors, adoption, reputational/professional benefits, doing a good deed (if that motivates them) while avoiding permissively giving everything away. The idea that it's problematic that you can't build a business model on their software is akin to the "forced charity" mindset—"why did you make something that I can't use for free?"
If you see a GPL'd bit of code that you really want to use in your business, email the developers with an offer of $X,000 for a perpetual commercial license and a $Y,000/yr support contract. Most are not so ideologically pure to refuse. It's a win-win-win: your business gets the software, the developers don't feel exploited, noncommercial downstream users can enjoy the fruits of open software, and everybody's contributed to a healthier attitude on open source.
> "This is the automation of dependency hell. The problem is that not everything needs to be automated, especially hell. Dependency hell is a real thing which anyone who has worked on a large project has experienced. Projects having thousands, if not tens of thousands, of dependencies where you don’t know if they work properly, where the bugs are, you don’t know how anything is being handled—it’s awful.
This is the wrong thing to automate. You can do this manually, however it doesn’t stop you getting into hell, rather just slows you down, as you can put yourself into hell (in fact everyone puts themselves into hell voluntarily). The point is it makes you think about how you get there, so if you have to download manually, you will start thinking “maybe I don’t want this” or “maybe I can do this instead”. And when you need to update packages, being manual forces you to be very careful."
I sympathise with this, but I have to respond that we have to live within existing ecosystems. Getting rid of npm and doing things manually won't make SPAs have fewer dependencies; the build would just be incredibly slow and painful.
Packages themselves are not bad. NPM is just fine - so long as you don't let it do dependency resolution and lock the version of every package. Note that this means you have to get notified when each package is updated (how!) and make a decision on how to update it (or if you decide not to update make a decision to maintain it).
The other thing is your package manager cannot go out to the internet randomly. You need it to download from a place (which might or might not be the default) that you are comfortable will exist as long as you need packages, and that will keep the versions of packages you want around. If you are a company project, that means an internal server/mirror, because otherwise something you depend on will disappear in the future. (Most often they decide nobody is using it and delete it, but sometimes the thing is discovered to be an illegal copyright violation - though you have to ask your lawyers what that means for you - perhaps a license is easy to get.)
> Getting rid of npm and doing things manually won't make SPAs have fewer dependencies; the build would just be incredibly slow and painful.
Honestly, I don't think this is true in the slightest. Rather, I hypothesize that people want to use such tooling and think the alternatives are slower, which I don't think is true.
If people actually did use fewer dependencies, we would actually have websites that didn't take ages to load and were responsive.
Some years ago, I had to reproduce a neural model build that had only been done previously by a single colleague on her laptop, not using a package manager.
Part of my reproducing the build was to conduct all the library downloading in a miniconda environment, so at the end I had a reproducible recipe.
Is the original author seriously claiming that anybody was better off with the original, "pure" ad-hoc approach?
In the context of my team, us getting rid of npm wouldn't change the whole SPA ecosystem. Or the various requirements we have that effectively mandate SPA like applications.
But in the context of newer ecosystems or ones that are more tightly controlled things might be different. For example if apple massively expanded the swift standard library and made dependency management painful, iOS apps might end up having fewer dependencies.
Yes, because of human limits of time and of skills.
I remember installing software in the early 90s: download the source code, read the README, find and download the dependencies, read their READMEs, repeat a few times. Sometimes one dependency could not compile because of some incompatibility or bug. Some could be fixed, some couldn't. Often everything ended up with a successful compilation and install, and in one day of work I could have what I'm getting in a few minutes now.
Actually those were small programs by today's standards. My take is that we would achieve less if we had to use fewer dependencies.
By the way, the last time I compiled something from source was yesterday. It was openvpn3 on Debian 13, which is still unsupported. TL;DR, it works, but the apt-get commands are a little different from the ones in BUILD.md.
There already is a (partial) solution to dependency hell: Nix.
It will at least massively help prevent things from breaking unexpectedly.
It won't prevent you from having to cascade a necessary upgrade (such as a security fix) across the entire project until resolution/new equilibrium is achieved.
My solution to the latter is simply to try to depend on as few things as possible. But eventually, the cancer will overtake the project if it keeps growing.
Nix isn't a solution to the problem of package managers. It's just a better package management system, which thus makes it easier to get to dependency hell. So I'd argue it puts fuel on the flames.
The solution is just to depend on fewer things and manage them manually.
Regardless of how they define these terms, producing a list of hashes which function as a commitment to specific versions of dependencies is a technique essential to modern software development. Whatever the tools are called, and whatever they do, they need to spit out a list of hashes that can be checked into version control.
You could just use git submodules, but in practice there are better user experiences provided by language package managers (`go mod` works great).
A good amount of this ranting can probably be attributed to projects and communities that aren't even playing the list of hashes game. They are resolving or upgrading dependencies in CI or at runtime or something crazy like that.
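As a rough illustration of the list-of-hashes idea (a hypothetical verifier, not any particular tool's lockfile format), something like this can produce one hash per vendored dependency so the output can be diffed against a manifest checked into version control:
`
// Hypothetical sketch: print one hash per vendored dependency directory.
// Diff the output against a committed manifest (e.g. a vendor.sum file)
// in CI to catch any drift from what was reviewed.
package main

import (
	"crypto/sha256"
	"fmt"
	"io"
	"io/fs"
	"os"
	"path/filepath"
	"sort"
)

// hashDir hashes every regular file under dir in a stable (sorted) order,
// mixing in the file paths so renames also change the hash.
func hashDir(dir string) (string, error) {
	var files []string
	err := filepath.WalkDir(dir, func(p string, d fs.DirEntry, err error) error {
		if err != nil {
			return err
		}
		if !d.IsDir() {
			files = append(files, p)
		}
		return nil
	})
	if err != nil {
		return "", err
	}
	sort.Strings(files)
	h := sha256.New()
	for _, p := range files {
		fmt.Fprintln(h, p)
		f, err := os.Open(p)
		if err != nil {
			return "", err
		}
		_, err = io.Copy(h, f)
		f.Close()
		if err != nil {
			return "", err
		}
	}
	return fmt.Sprintf("%x", h.Sum(nil)), nil
}

func main() {
	entries, err := os.ReadDir("vendor")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	for _, e := range entries {
		if !e.IsDir() {
			continue
		}
		sum, err := hashDir(filepath.Join("vendor", e.Name()))
		if err != nil {
			fmt.Fprintln(os.Stderr, err)
			os.Exit(1)
		}
		fmt.Printf("%s  %s\n", sum, e.Name())
	}
}
`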
The argument here is (in brief) "Package management is hell, package managers are evil. So let's handle the hell manually to feel the pain better".
And honestly speaking: It is plain stupid.
We can all agree that abusing package management with ~10000 micro-packages everywhere, like npm/python/ruby do, is completely unproductive and brings its own considerable maintenance burden and complexity.
But ignoring the dependency resolution problem entirely by saying "You do not need dependencies" is even dumber.
Not every person is working in an environment where shipping a giant blob executable built out of vendored static dependencies is even possible. This is a privilege the gamedev industry has, and the author forgets a bit too easily that it is domain specific.
Some of us work in environments where the final product is an agglomerate of >100 components developed by >20 teams around the world. Versioned over ~50 git repositories. Often mixed with some proprietary libraries provided by third-party providers. Gluing, assembling and testing all of that is far beyond the "LOL, just stick to the SDL" mindset proposed here.
Some of us are developing libraries/frameworks that are used embedded in >50 products with other libraries, with a hell of multiple combinations of compilers / ABIs / platforms. This is not something you want to test nor support without automation.
Some of us have to maintain cathedrals that are constructed over decades of domain-specific knowhow (scientific simulators, solvers, petrol prospection tools, financial frameworks, ...) in multiple languages (Fortran, C, C++, Python, Lua, ...) that cannot just be re-written in a few weeks because "I tell you: dependencies suck, bro".
Managing all of that manually is just insane. And it generally finishes with a home-made, half-baked bunch of scripts that badly mimic the behavior of a proper package manager.
So no, there is no replacement for a proper package manager: Instead of hating the tool, just learn to use it.
Package managers are tools, and like every tool, they should be used wisely and not as a Maslow's hammer.
I am not sure how you got this conclusion from the article.
> So let's handle the hell manually to feel the pain better
This is far from my position. Literally the entire point is to make it clearer you are heading to dependency hell, rather than feel the pain better whilst you are there.
I am not against dependencies but you should know the costs of them and the alternatives. Package managers hide the complexity, costs, trade-offs, and alternative approaches, thus making it easier to slip into dependency hell.
> Some of us work in environments where the final product is an agglomerate of >100 components developed by >20 teams around the world. Versioned over ~50 git repositories. Often mixed with some proprietary libraries provided by third-party providers. Gluing, assembling and testing all of that is far beyond the "LOL, just stick to the SDL" mindset proposed here.
Does this somehow prevent you from vendoring everything?
> Does this somehow prevent you from vendoring everything?
Yes. Because in these environments, sooner or later you will be shipping libraries and not executables.
Shipping libraries means that your software will need to be integrated in other stacks where you do not control the full dependency tree nor the versions there.
Vendoring dependencies in this situation is the guarantee that you will make the life of your customer miserable by throwing the diamond dependency problem right in their face.
It certainly gets in the way. The more dependencies, the more work it is to update them, especially when for some reason you're choosing _not_ to automate that process. And the larger the dependencies, the larger the repo.
Would you also try to build all of them on every CI run?
What about the non-source dependencies, check the binaries into git?
This article is a bit all over the map with discussions of high trust societies and their relation to language design and software dependency management.
That said, I think the final takeaway is that systems that allow you to pin versions, vendor all those dependencies, and resolve/reproduce the same file tree regardless of whose machine it's on (let's assume matching architectures for simplicity here) are the goal.
Note that removing 'manually' here, this still works:
> Copying and vendoring each package {manually}, and fixing the specific versions down is the most practical approach to keeping a code-base stable, reliable, and maintainable.
The article's emphasis on the manual aspect of dependency management is a bit of a loss, as I don't particularly believe it _has to be manual_ in the sense of manually copying files from their origin into your file tree; that certainly is a real-world option, but few (myself included) would take that monk-like path again. I left this exact situation in C land and would not consider going back unless adopting something like ninja.
What the OP is actually describing is a "good" package manager feature set, and many (sadly not most or all) do support this exact feature set today.
PS I did chuckle when they defined evil in terms of something that gets you to dependency hell faster. However, we shouldn't be advocating for committing the same sins of our fathers.
As I age, I do everything I can to avoid "one more dependency". There's a perverse nerd-bragging effect that takes place where people equate their value as a programmer to how many dependencies they can name-drop and mash up into their solution. It makes sense, since we evaluate each other, and the more dependencies you can reference in your past work, the more stepping stones you've stepped on and, therefore, the longer you've been on the path.
Anymore, as I evaluate fellow programmers, I'm looking for whether they've discovered "one more dependency" is like signing up for "one more subscription you have to remember to pay for" and what they do to try and mitigate it.
I definitely used to look for reasons to include cool new dependencies that I found, just to try and do something with cool libraries.
But as I got bit by the various issues with dependencies multiple times over the years, I have ended up preferring as few as possible and ideally zero beyond the standard library for hobby projects if I can get away with it.
Missed one of the biggest problems: most assume the world is simpler than it really is. Not all the world is Rust, Python, Go, C++, or whatever language you advocate. I have millions of lines of C++; I'm interested in other languages, but they need to interoperate with our C++. When we have a C++ library that does what you want, I don't want you adding a new package to do the same thing, as now we need to support both - sometimes we will accept that using the package is the right answer, but often we need bug compatibility.
Not sure why this argument doesn't also apply to operating systems. Maybe everyone should be writing all their programs to run on a custom micro-kernel. Surely we can't trust other programmers to write something as complicated as an operating system.
There is the question. See "Reflections on Trusting Trust" (a classic paper). However, in the end you cannot do everything you might want to, and so you must trust someone else. Operating systems are common, audited by many, and used by enough people that you can have high trust they work in general (but there are some not worthy of trust). Package managers tend to contain many packages that are not in common use, and the only one who audits them might be you, so you had better do it yourself for each release.
If you only use a package manager for libraries that you have high trust in then you don't need to worry - but there are so few projects you can have high trust in that manual management isn't a big deal. Meanwhile there are many many potentially useful packages that can save you a lot of effort if you use them - but you need to manually audit each because if you don't nobody will and that will bite you.
I'd rather be able to update my dependencies automatically with a few commands instead of manually vendoring all my dependencies; keeping up to date is really important for security. I get that game developers who only ever work on building single-player games might have different opinions on "package managers", but they are in a very small niche.
One of the worst things about working at companies shipping C++ was the myriad of meta-build systems that all try to do dependency management as part of the build system without having a separate concept of what a "package manager" is. This is truly the worst of both worlds, where people are happy to add dependencies, never update them, and never share code between projects and departments. I do not wish that way of working on my worst enemies.
Whatever problems package management brings, they are a much better problem to have than not having a package manager at all. That said, I think everyone can get better at being more discriminating about what they add to their project.
I wouldn't say I'm a dependency maximalist, but it's not far off.
Yes, shared code has costs
- more general than you likely need, affecting complexity, compile times, etc
- comes with risks for today (code) and the future (governance)
But the benefits are big. My theory for one of the causes of Rust having so many good CLIs is Cargo, because it keeps the friction low for pulling in high-quality building blocks so you can better focus on your actual problem.
Instead of resisting dependencies, I think it would be better to spend time finding ways to mitigate the costs, e.g.
> My general view is that package managers (and not the things I made distinctions about) are probably in general a net-negative for the entire programming landscape, and should be avoided if possible.
I am arguing that their "benefits" are only very short-term, if there is actually any benefit to them in the first place. The strawman that you present has been repeated already and does not consider that all of those other things are actually useful and good alternatives.
I'm very thankful for the Debian team's efforts to include most of my most commonly used software packages in their repo. Out of all the differences between my workflow and my colleagues' workflows on macOS and Windows, this is the most impactful one. I don't remember the last time I had any kind of dependency issue. I keep updating my packages when I log on and there are no version and/or dependency issues whatsoever.
> In real life, when you have a dependency, you are responsible for it. If the thing that is dependent on you does something wrong, like a child or business, you might end up in jail, as you are responsible for that.
Isn't this backwards? In real life, if you have a dependent, you are responsible for it. On the other hand, if you have a dependency on something, you rely on that thing, in other words it should be responsible for you. A package that is widely used in security-critical applications ought to be able to be held accountable if its failure causes harm due to downstream applications. But because that is in general impossible and most library authors would never take on the risk of making such guarantees, the risk of each dependency is taken on by the person who decides it is safe to use it, and I agree package managers sometimes make that too easy.
I mean, sure. So what does the solution look like? From my perspective it looks like a tool that is able to update your dependencies so that you can easily pick up bug fixes in your dependencies, which sounds an awful lot like a package manager.
> JavaScript is great example of this as there are multiple different package managers for the language (npm being one of the most popular), but because each package manager defines the concept of a package differently, it results in the need for a package manager manager.
This doesn't seem like a strong point to me. Yes, there are things like yarn, pnpm, etc. But IIUC practically all npm alternatives still define packages in the same way (a package.json at the root hosted by npmjs (or your private repo)), and the differences are ergonomic/performance related.
> [that each package manager defines the concept of a package differently] is why I am saying it is evil, as it will send you to hell quicker.
Then I think it's more of a language problem, not a problem with the concept of a package manager.
> It looks like a tool that is able to update your dependencies so that you can easily pick up bug fixes in your dependencies, which sounds an awful lot like a package manager.
If only it were that easy.
Often the update isn't source-compatible with the package that uses it, so you can't update. There are some projects I use that I can't update because I use 6 different plugins, and each updates against the main project on a different schedule, on its own terms - meaning the only version I can use is 10 years out of date and there appears to be no chance they will all update. (If this were critical I'd update it myself, but there are always more important things to work on, so I never will in practice.)
Sometimes a package will change license and you need to check the legalese before you update.
Sometimes a package is hijacked (see xz) and so you really should be doing an audit of every update you apply.
I agree with all of the problems that you're highlighting, but would say that all of those problems exist whether you're doing manual dependency management or using a package manager.
The solution IMO (which is non-existent afaik) would be to integrate some kind of third party auditing service into package managers. For example, for your npm project you could add something like this to your package.json:
`
"requireAuditors": [
  {
    "name": "microsoft-scanning-service",
    "url": "https://npmscanner.microsoft.com/scanner/",
    "api_key": "yourkeyhere, default to getting it from .env"
  }
]
`
And when you npm install, the version / hash is posted to all of your required auditors' URLs. npm should refuse to install any version that hasn't been audited. You can have multiple auditing services defined, maybe some of them paid or able to scan your own internal packages, etc.
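Roughly, that check could look like the sketch below. The auditor endpoint, its parameters, and the whole flow are hypothetical (nothing like this exists in npm today); it's written in Go only to show the shape of the idea:
`
// Hypothetical sketch of the proposed audit gate: before installing a
// package, ask every configured auditor whether this (name, version, hash)
// has been approved, and refuse the install otherwise.
package main

import (
	"fmt"
	"net/http"
	"net/url"
	"os"
)

type Auditor struct {
	Name string
	URL  string // invented endpoint, e.g. "https://auditor.example.com/check"
}

// approved asks one auditor about one package version; a 200 means "audited".
func approved(a Auditor, pkg, version, hash string) (bool, error) {
	q := url.Values{"package": {pkg}, "version": {version}, "hash": {hash}}
	resp, err := http.Get(a.URL + "?" + q.Encode())
	if err != nil {
		return false, err
	}
	defer resp.Body.Close()
	return resp.StatusCode == http.StatusOK, nil
}

func main() {
	auditors := []Auditor{
		{Name: "example-scanner", URL: "https://auditor.example.com/check"},
	}
	pkg, version, hash := "left-pad", "1.3.0", "sha512-..." // placeholders
	for _, a := range auditors {
		ok, err := approved(a, pkg, version, hash)
		if err != nil || !ok {
			fmt.Printf("refusing to install %s@%s: not approved by %s\n", pkg, version, a.Name)
			os.Exit(1)
		}
	}
	fmt.Println("all auditors approve; proceeding with install")
}
`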
I've thought about building a PoC of this myself a couple of times because it's very much on my mind, but haven't spent any time on it and am not really positioned to advocate for such a service.
Yeah, yarn and co came about because npm was slow, buggy and didn't honor its own lockfile.
Nowadays it is mostly improved, and the others differentiate by enhancements to workspaces (better monorepo support) or more aggressive caching by playing games with where the installed packages physically exist on the system.
The core functionality - what a package is - has always been the same across the package managers though, because the runtime behavior is defined by node, not the package manager.
There are no solutions, only trade-offs. And the point is that not everything needs to be, nor ought to be, automated. And package managers are a good example of this.
And yes, a language with an ill-defined concept of a package in the language itself is a problem of the language, but the package managers are not making it any better.
> And yes, a language with an ill-defined concept of a package in the language itself is a problem of the language, but the package managers are not making it any better.
If a language does not provide a definition of a package but a package manager _does_, then I would say that that package manager did make that aspect of the problem better.
"I mean, sure. So what does the solution look like? From my perspective it looks like a tool that is able to update your dependencies so that you can easily pick up bug fixes in your dependencies, which sounds an awful lot like a package manager."
Exactly! Who has the time or the discipline to do that manually?
The post goes on to say that random packages are not necessarily better than what members of your team could make. At the end it gets to:
> Through manual dependency management. Regardless of the language, it is a very good idea that you know what you are depending on in your project. Copying and vendoring each package manually, and fixing the specific versions down is the most practical approach to keeping a code-base stable, reliable, and maintainable. Automated systems such as generic package managers hide the complexity and complications in a project which are much better not hidden away.
So that makes all of us human package managers. It's also true that you can get a package manager from internet folk that works better than the processes and utilities your team cobbles together to ease the burden.
I’ve had major Nissan Altima effect with this lately. A few weeks ago I set out to make a simple C and C++ package manager that just ignores dependency hell in favor of you explicitly specifying packages. And no binaries, just build from source and use Git as a backend for managing what source maps to what builds.
Plus Lua for package recipes. It’s going really well!
I don't find the argument very convincing. If anything you should have more time to carefully vet your dependencies when you don't need to spend time manually doing the tedious bit. Making things more difficult than they need to be just to introduce friction is a ridiculous proposition.
Package managers are constraint solvers. You could manually figure out if XYZ shared library works with somebody else's code, or, you could expect that they would label the range of shared library versions their code needs.
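As a toy sketch of that constraint-solving step (integer versions and ranges invented purely for illustration), the solver's job for one shared library is essentially:
`
// Toy model of version constraint solving for a single shared dependency:
// each consumer declares an allowed range, and we pick the highest version
// (if any) that satisfies everyone. Real solvers do this across a whole graph.
package main

import "fmt"

type Range struct{ Min, Max int } // inclusive, e.g. ">=2, <=4"

// resolve returns the highest available version allowed by all constraints,
// or -1 if the constraints cannot be satisfied together.
func resolve(available []int, constraints []Range) int {
	best := -1
	for _, v := range available {
		ok := true
		for _, c := range constraints {
			if v < c.Min || v > c.Max {
				ok = false
				break
			}
		}
		if ok && v > best {
			best = v
		}
	}
	return best
}

func main() {
	available := []int{1, 2, 3, 4, 5}
	consumers := []Range{{Min: 2, Max: 5}, {Min: 1, Max: 4}} // two consumers' ranges
	fmt.Println(resolve(available, consumers))               // prints 4
}
`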
There are three points of prioritization here: you can use other peoples' code, manually vet all the code you're running, or accept that you need to trust a social network to vet stuff for you. Pick two. This is not a solvable problem.
EDIT: I've been rate limited, so the point is: unless you're Terry Davis, you're not going to be able to write software of any real complexity. Few people are going to even bother to vet the standard library, let alone the compiler, the runtime, etc etc.
If only people knew how easy it is to implement the standard library and make it way simpler than what is usually provided, everyone would be writing their own standard libraries. You can implement one with string manipulation, files, memory management, threading, and basic timing in less than 1000 LOC of C code, as I have done before; the biggest parts by far were console printing and filesystem stuff, and that's mostly because of Windows UTF-16 conversion nonsense.
Honestly, he's not wrong. I use Ruby and 99% of the gems on rubygems.org are absolute trash. I use Rails and stuff like Nokogiri or Faraday, also RubyLLM, but little else because of reasons.
NPM is even worse, you import one thing and get 1000s of trash libraries so nowadays the only JS I write is vanilla and I import ES Modules manually.
Also, Odin doesn't make adding dependencies that difficult; you can literally just throw an Odin library into your project as a folder and it's available. The Odin compiler does everything else for you.
It's impossible to know what issues they have, since they don't specify.
But no, for the vast majority of people, SDL2 is perfectly fine, although SDL3 is a vast improvement. It's as stable and battle-tested as a cross platform multimedia library is bound to get. Opening a window and polling input is trivial.
Then again I've never even heard of the language they're using (Odin) so maybe that doesn't play well with a C library.
"When using Go for example, you don’t need any third-party libraries to make a web server, Go has it all there and you are done."
Fine, now what if you need to connect to a database, or parse a PDF, or talk to a gRPC backend? What a hilariously short-sighted example.
To me, this whole article just screams inexperience.
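For reference, the quoted claim does hold for the basic case; a complete server using only Go's standard library is just this minimal sketch, and everything past it is where third-party code starts creeping in:
`
// A complete HTTP server using only Go's standard library (net/http),
// no third-party packages; shown only to illustrate the quoted claim.
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintf(w, "hello from the standard library: %s\n", r.URL.Path)
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
`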
The Author isn't arguing for not using third party dependencies. He's arguing for developers to be more conscious of the dependencies they use, by manually vetting and handling them. That screams "I've been down the package manager route and paid the price". Not inexperience.
> He's arguing for developers to be more conscious of the dependencies they use
"be careful all the time" doesn't scale. Half of all developers have below-average diligence, and that's a low bar. No-one is always vigilant, don't think that you're immune to human error.
No, you need tooling and automation to assist, and it needs to be supported on the package manager side. Managing a site where many files are uploaded and then downloaded many times is not a trivial undertaking. It comes with oversight responsibilities. If it's video, you have to check for CSAM. If it's executable code, then you have to check for malware.
Package managers are not evil, but they are a tempting target and need to be secured. This can't just be an individual consumer responsibility.
I can't speak for other ecosystems, but some NuGet measures are here:
https://devblogs.microsoft.com/dotnet/building-a-safer-futur...
https://learn.microsoft.com/en-us/nuget/concepts/security-be...
I believe that there have been (a few) successful compromises of packages in NuGet, and that these have been mitigated. I don't know how intense the arms race is now.
But he titled the post "package managers are evil".
I disagree with this take. There should be just more governance on the registry side of things.
For NuGet or Maven I think dependency hell is not something you run into, and I don’t have a package manager manager for those languages.
There should be enough trust just like I can do sudo apt install.
His take screams "I want to push my niche approach and promote my language from my ivory tower of language creator". He still might not have any relevant experience building line-of-business software, just like I don't have experience with building compilers or languages.
Inexperience of an author who has been developing a quite successful programming language for something like 10 years? Quite a bold statement.
Actually his perspective is quite reasonable. Go is at the other end of the spectrum from languages encouraging "left-pad"-type libraries, and this is a good thing.
Not to mention we've had decades of software development without automated package managers and people did just fine.
I've seen plenty of intelligent people acting pretty stupid.
As my psychology professor used to say: "Smart is how efficiently you use your intelligence. Or don't."
So someone with a pretty low IQ can be smart - Forrest Gump. Or someone with a high IQ can be dumb occasionally - a professor so attuned to his research topic at the expense of everything else.
Is it "quite successful"? How would I distinguish such a "quite successful" language from say Hare or V or are these all "successful" in your mind?
To me, this whole comment just screams inability to steelman.
Sure... and, to prove your point, Go has a package manager too (although it's a relatively new addition). But Go still follows a "batteries included" approach, where "standard" stuff (yes, even database handling) is handled by the standard library. Which still leaves lots of other things for which you need third party packages, but those will be typically far fewer than in other languages.
I think the argument presented is that whatever a Go package does, it does well.
Btw the Js ecosystem also has quite a few good packages (and a ton of terrible ones, including some which everyone seems to consider as the gold standard).
I don't see the value in making it even harder to build software. I want to make things. Downloading a dependency manually and then cursing at the compiler because "it's right there! why won't it load!!" is just gonna make me want to build software less.
Anyone I want to work with on a project is going to have to deal with the same frustration and want to work on the project less. Even more so, because it turns out they downloaded version 2.7.3-2 but the version I use is 2.7.3-1.
> Downloading a dependency manually and then cursing at the compiler because "it's right there! why won't it load!!"
Odin's compiler knows what a package is and will compile it into your program automatically.
Isn't that a (built-in) package manager if it works for general packages? Or does it work only for selected dependencies?
This is an argument for a good build system, not a package manager.
These aren’t always separate.
Some distros might try to support multiple versions of a library. That could require installing it to different prefixes instead of the default. Thus, the build system will have to comprehend that.
Build systems are yet another special circle of hell.
Although this article tries to provide some arguments for why package managers are "evil", I found the argumentation pretty weak and non-specific. It's good if you have experiences that confirm a specific point of view, but I think these experiences need to be explained in some more detail, because people reading your article may not have had similar experiences and would therefore find it hard to agree with your points - just like me.
To give a concrete example, you said that javascript does not have a definition of a "package" in its language. But what does that really mean, and why should it lead to package manager managers? Because for me, a person who has worked with javascript just a little bit, I know package.json exists and most of the package managers I've worked with agree on what the contents of this file mean. If we limit our understanding to just npm, yarn and probably bun, we don't see how that causes or contributes to the dependency hell problem (sure it exists, but how?).
You said that Go mitigates the issue of dependency hell to some degree, but this is an interesting thought - give it more exploration! Why is this problem not as severe in something like Go as it is in Javascript?
I may not remember the details of what you said in the article and I would like to check, but currently I can't access the site because it times out for me.
package.json is a convention, not a language definition, hence package managers may implement "package" management differently. In reality, conventions are followed until they aren't, and that's where hell begins (if something can be abused, it will be). Go and Odin define a package in the language itself as a folder containing source files, and they mitigate many management issues by just having a good standard library, so you wouldn't need as many packages to begin with.
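A minimal sketch of what "a package is a folder" means in Go (the module path, folder names, and the Reverse helper are all hypothetical, and the two files are described together in one listing):
`
// Hypothetical layout; the folder strutil/ *is* the package strutil:
//
//   myproject/
//     go.mod       -> module example.com/myproject
//     main.go      -> package main (this file)
//     strutil/
//       reverse.go -> package strutil, exporting func Reverse(string) string
//
// main.go:
package main

import (
	"fmt"

	"example.com/myproject/strutil" // import path = module path + folder name
)

func main() {
	fmt.Println(strutil.Reverse("hello"))
}
`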
In general, I think the dependency hate is overblown. People hear about problems with dependencies because dependencies are usually open source code used by a lot of people, so it is public and relevant. You don't hear as much about problems in the random code of one particular company unless it ends up in a high-profile leak. For example, something like the Heartbleed bug was a huge deal and got a lot of press, but imagine how much trouble we would be in if everyone was implementing their own SSL. Programmers often don't follow best practices when they do things on their own. That is how you end up with things like SQL injection attacks in 2025.
Dependencies do suck, but it is because managing a lot of complicated code sucks. You need some way to find issues over time and keep things up to date. Dependencies and package managers at least offer us a path to deal with problems. If you are managing your own dependencies, which I imagine would mean vendoring, then you aren't going to keep these dependencies up to date. You aren't going to find out about exploits in the dependencies and apply fixes for them.
> imagine how much trouble we would be in if everyone was implementing their own SSL.
No, the alternative is to imagine how many issues we would be in if every project pulled in 5 different SSL libraries. Having one that everybody uses and that is already installed on everyone's system is avoiding dependency hell. Even better if it's in stdlib.
I see this a lot with Rust where I will depend on one or two external crates for a simple application and then I am shocked to see dozens of dependencies being pulled in when I go to build. I actually think Cargo's support for feature gates and conditional compilation could in theory be a strong mitigation against this as crates can avoid pulling in dependencies unless you actually need a feature that relies on them, but in practice it doesn't seem to work that way as I often see these complaints about Rust.
I sympathise with the arguments but IMO laziness will always win out. If Rust didn't have Cargo to automate dependency hell, someone would create a third party script to fill the gap.
It is an organizational problem, not a technical one.
When I worked at Google every single dependency was strictly vendored (and not in the mostly useless way that Cargo vendors things). There was generally only one version of a dep in the mono repo, and if you wanted something.. you generally got to own maintaining it, and you had to make sure it worked for every "customer" -- the giant CI system made sure that you knew if an upgrade would break things. And you reached out to stakeholders to manage the process. Giant trains of dependencies were not a thing. You can do that when you have seemingly infinite budget.
But technology can indeed make it worse. I love Rust, but I'm not a fan of the loose approach in Cargo and esp Crates.io, which seems to have pulled inspiration from NPM -- which I think is more of a negative than positive example. It's way too easy to make a mess. Crates.io is largely unmoderated, and its namespace is full of abandoned or lightly maintained projects.
It's quite easy to get away with a maze of giant transitive deps w/ Cargo because Rust by default links statically, so you don't usually end up in DLL hell. But just doing cargo tree on the average large Rust project is a little depressing -- to see how many separate versions of random number generators, SHA256, MD5, etc libs you end up with in a single linkage. It may not be the case that every single one is contributing to your binary size... but it's also kind of hard to know.
Understanding the blast radius of potential issues that come from unmoderated 3rd-party deps is I think something that many engineers have to learn the hard way. When they deal with a security vulnerability, or a fundamental incompatibility issue, or have to deal with build time and binary size explosions.
I wish there was a far more mature approach to this in our industry. The trend seems to be going in the opposite direction.
In many ways traditional Linux distros operate on a similar model to what I imagine Google's monorepo is. Both aim for this "globally consistent" dependency situation, where you have one version of each library and you patch things up from upstream when they don't fit.
I feel we need more of these kinds of distros, so you don't need to manage dependencies directly from upstream and deal with the integration effort yourself. What if we had a Rust distro following this same model, where there is only one version of each dep, some reasonable curation, and nice, clear release cycles? I feel that could be a real boon for the industry.
> If Rust didn't have Cargo to automate dependency hell, someone would create a third party script to fill the gap.
Possibly, but not guaranteed. Some other languages without a built-in package manager haven't had an external one manage to take over the ecosystem, most (in)famously C and C++, while others have.
Most language users will follow the "spirit" of the language - e.g. Bill is against package managers, people who use his language mostly agree with his ideas, and there's not a huge standard Odin package manager.
I rather appreciate that C and C++ don't have a default package manager that took over - yes, integrating libraries is a bit more difficult, but we also have a lot of small, self-contained libraries that just "do the thing" without pulling in a library that does colored text for logging, which pulls in tokio, which pulls in mio, which pulls in wasi, which pulls in serde, which is insane.
The package manager for C/C++ is apt, or rpm, or whatever package manager your system uses. These package managers were designed for the world of C/C++ software so it's less surprising that these languages haven't found as much of a push towards language package managers.
Rust’s big issue here is the anemic standard library. I think the strategy makes some amount of sense overall; since there’s so much crazy alchemy in Rust like depending on nightly, no_std, etc., including stuff in std has more downside than in a more stable language like Go.
But it’s annoying to have to deal with 3 different time libraries and 3 different error creation libraries and 2 regex libraries somehow in my dependency tree. Plus many packages named stuff like “anyhow” or “nom” or other nonsense words where you need to google for a while to figure out what a package is supposed to do. Makes auditing more difficult than if your library is named structured-errors or parser-combinator.
I don’t like the Go programming language, but I do like Go tooling and the Go ecosystem. I wish there was a Rust with Go principles. Swift is kinda in the right ballpark: packages are typically named stuff that makes sense, and Swift is closer to Rust perf and Rust safety than to Go perf and Go safety. But Swift is a tiny ecosystem outside of stuff that depends on the Apple proprietary universe, and the actual APIs in packages can be very magical/clever. ¯\_(ツ)_/¯
The very sparse std is one of the few genuine mistakes I think Rust has made. I know the arguments for it, but I don't find them persuasive. A batteries included standard library, in my view, is just plain better and every modern language should have one.
I agree, though also I note Python has an extensive standard library and isn't much better in terms of package sprawl.
Just as the Rust community has largely converged on tokio as the standard async runtime, is there any reason why there couldn't exist a community-developed "batteries-included" standard library other than writing a standard library being a tedious and thankless task?
> How do I manage my code without a “package manager”? [...] Through manual dependency management.
Slackware Linux does precisely that.
I'm a Slackware user. Slackware does have a package manager that can install or remove packages, and even a frontend that can use repositories (slackpkg), but dependency resolution is manual. Sure, there are 3rd-party managers that can add dependency resolution, but they do not come with the distro by default.
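For anyone unfamiliar, a rough sketch of what that looks like in practice (the package name and file name are illustrative):

    # Install or remove a package; nothing checks or pulls in dependencies for you
    installpkg somelib-1.2.3-x86_64-1.txz
    removepkg somelib

    # slackpkg talks to a repository, but dependency resolution is still on you
    slackpkg update
    slackpkg install somelib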
This is a very personal opinion, but manual dependency management is a feature. Back in the day, I remember installing Mandrake Linux 9.2 and activating the (then new-ish) framebuffer console. The distro folks had no better idea than to force a background "9.2" image on framebuffer consoles, which I hated. I finally found the package responsible for that. Removing it with urpmi, however, meant removing all the graphical desktop components (including X11) because that stupid package was listed as a dependency of everything graphical.
That prompted me to seek alternatives to Mandrake and ended up using Slackware. Its simplicity had the added bonus of offering manual dependency resolution.
Sounds like "alias dpkg='dpkg --force-depends'"?
Perhaps; I'm not really knowledgeable on the ways of Debian.
> Dependency hell [0] is a real thing which anyone who has worked on a large project has experienced. Projects having thousands, if not tens of thousands, of dependencies where you don’t know if they work properly, where are the bugs, you don’t how anything is being handled—it’s awful.
[0] https://en.wikipedia.org/wiki/Dependency_hell
I find it strange that they use a term with a common meaning, link to that meaning, and then talk about something else?
Same. The first thing I thought was "wait a second, that isn't dependency hell".
The second thing is that their version of dependency hell - having lots of dependencies introducing lots of bugs that you would not have written - is not my experience. 99% of the time, my bugs are in my own code, lol. Maybe once you become a much better programmer than me, you stop writing bugs in your own code and instead start having to deal with bugs in the PNG parsing library you depend on or something, and at that point writing your own PNG parsing library becomes a good use of your time. But I'm certainly not at that point.
I've had to fix bugs in dependencies of course. Here is one I fixed yesterday [0]. But it's much closer to the exception than the rule.
[0]: https://github.com/sanity/pav.rs/pull/4
Where?
I don't know what the solution to this problem is, but I do remember a time (around 20 years ago) when this wasn't a big problem. I was working on a fairly large (each module between 50k and 100k LOC) C++ system. The process for using libraries:
1) Have problem that feels too complicated to hand-code.
2) Go on Internet/forums, find a library. The library is usually a small, flat collection of atomic functions.
3) A senior engineer vets the library and approves it for use.
4) Download the stable version: header file, and the lib file for our platform (on rare occasions, build it from source)
5) Place the .h file in the header path, and the lib file in the lib path; update the Makefile.
6) #include the header and call functions.
7) Update deployment scripts (bash script) to scp the lib file to target environment, or in some cases, use static linking.
8) Subscribe to a mailing list and very occasionally receive news of a breaking change that requires a rebuild.
This may sound like a lot of work, but somehow, it was a lot less stressful than dealing with NPM and node_modules today.
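Roughly, steps 4 through 7 boil down to something like this; the URLs, paths, and library name are invented for illustration:

    # Download the vetted release: one header, one prebuilt lib for our platform
    curl -O https://example.org/libfoo-1.2/foo.h
    curl -O https://example.org/libfoo-1.2/libfoo.a

    # Drop them into the include/lib paths the Makefile already knows about
    cp foo.h include/
    cp libfoo.a lib/
    # Makefile gains: CXXFLAGS += -Iinclude   LDFLAGS += -Llib -lfoo

    # Deployment: copy the lib to the target box, or link statically and skip this
    scp lib/libfoo.a deploy@target-host:/opt/app/lib/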
I think the main thing that makes this workable is "The library is usually a small, flat collection of atomic functions."
I find that it's the hell of transitive dependencies--you as a developer can reasonably vet a single layer of 10-30 standalone libraries. But if those libraries depend on other libraries, etc, then it balloons into hundreds or thousands of dependencies, and then you're sunk.
For what it's worth, I don't think much of this is essential complexity. Often a library is complicated because it supports 10 different ways of using it, but when you use the library, you're only using 1 of those ways. If everyone is only using 10% of thousands of transitive dependencies, the overall effect is incredibly complicated, but could have been achieved with 10-100% more short-term effort. Sure, "it took twice as long to develop but at least we don't have 10x the dependencies" is a hard sell to management (and often to ourselves), but that's because we usually choose to ignore the costs of depending on software we don't understand and don't control. We think that we're cleverly avoiding having to maintain and secure those libraries we outsourced, but most open-source developers aren't doing a great job of that anyway.
Often it really is easier to develop something from scratch, rather than learn and integrate a library. Not always though, of course.
In C and C++ you don't need the transitive dependencies for compilation; you only need the headers of the direct dependencies. As for linking, they are only needed when linking dynamically, which was much less prevalent 20 years ago.
This reads much more like a critique of traditional open-source development than package managers themselves.
The author asserts that most open-source projects don't hit the quality standard where their libraries can just be included and they'll do what they say.
I assert that this is because there's no serious product effort behind most libraries (as in, no dedicated QA/test/release cycle), and no large commercial products use them (or if they do, they either do it in a very limited fashion or just fork).
Hobbyists do QA as long as it interests them / fits their use case, but only the big vendors do bulletproof releases (which in the desktop realm seems to be only MS/Apple).
This might have to do with the domain the author chose - desktop development has unfortunately had the life sucked out of it, with every dev now being a fullstack/cloud/ML/mobile dev; its mindshare and the resources going toward it have plummeted.
(I also have a sneaking suspicion the author might've encountered those bugs on desktop Linux, which, despite all the cheerleading (and policing of negative opinions), is as much of a buggy mess as ever.)
In my experience, it's quite likely to run into a bug that nobody has written about on the internet ever.
This critique applies even to closed-source development that uses open-source code bases.
I have an article on my unstructured thoughts on the problems of OSS/FOSS which goes into more depth about this: https://www.gingerbill.org/article/2025/04/22/unstructured-t...
This is why I'm so glad that I work in a closed monorepo now. There is no package management, only build tooling.
I find myself nodding along to many of the technical and organizational arguments. But I get lost in the licensing discussion.
If it is a cultural problem that people insist on giving things away for free (and receiving them for free), then viral licenses can be very helpful, not fundamentally pernicious.
Outside of the megaprojects, my mental model for GPL is similar to proprietary enterprise software with free individual licenses. The developer gets the benefits of open projects: eyeballs, contributors, adoption, reputational/professional benefits, doing a good deed (if that motivates them) while avoiding permissively giving everything away. The idea that it's problematic that you can't build a business model on their software is akin to the "forced charity" mindset—"why did you make something that I can't use for free?"
If you see a GPL'd bit of code that you really want to use in your business, email the developers with an offer of $X,000 for a perpetual commercial license and a $Y,000/yr support contract. Most are not so ideologically pure to refuse. It's a win-win-win: your business gets the software, the developers don't feel exploited, noncommercial downstream users can enjoy the fruits of open software, and everybody's contributed to a healthier attitude on open source.
> "This is the automation of dependency hell. The problem is that not everything needs to be automated, especially hell. Dependency hell is a real thing which anyone who has worked on a large project has experienced. Projects having thousands, if not tens of thousands, of dependencies where you don’t know if they work properly, where are the bugs, you don’t how anything is being handled—it’s awful.
This the wrong thing to automate. You can do this manually, however it doesn’t stop you getting into hell, rather just slow you down, as you can put yourself into hell (in fact everyone puts themselves into hell voluntarily). The point is it makes you think how you get there, so if you have to download manually, you will start thinking “maybe I don’t want this” or “maybe I can do this instead”. And when you need to update packages, being manual forces you to be very careful."
I sympathise with this, but I have to respond that we have to live within existing ecosystems. Getting rid of npm and doing things manually won't make building SPAs have fewer dependencies, build would be incredibly slow and painful.
Packages themselves are not bad. NPM is just fine - so long as you don't let it do dependency resolution and you lock the version of every package. Note that this means you have to get notified when each package is updated (how!) and make a decision on how to update it (or, if you decide not to update, a decision to maintain it yourself).
The other thing is that your package manager cannot go out to the internet randomly. You need it to download from a place you are comfortable will exist as long as you need packages (which might or might not be the default), and that will keep the versions of packages you want around. If you are a company project, that means an internal server/mirror, because otherwise something you depend on will disappear in the future. (Most often they decide nobody is using it and delete it, but sometimes it is discovered the thing is an illegal copyright violation - you have to ask your lawyers what this means for you; perhaps a license is easy to get.)
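A minimal sketch of that setup with npm, assuming a hypothetical internal mirror URL and an arbitrary example package:

    # Point npm at a mirror you control and that will keep old versions around
    echo "registry=https://npm-mirror.internal.example.com/" >> .npmrc

    # Pin an exact version (no ^ or ~ range) and commit package-lock.json
    npm install --save-exact left-pad@1.3.0

    # In CI, install exactly what the lockfile says and fail on any drift
    npm ci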
> Getting rid of npm and doing things manually won't make building SPAs have fewer dependencies, build would be incredibly slow and painful.
Honestly, I don't think this is true in the slightest. Rather, I hypothesize that people want to use such tooling and think the alternatives are slower, which I don't think is true.
If people actually did use fewer dependencies, we would actually have websites that didn't take ages to load and were responsive.
So the existing ecosystems are just bad.
Some years ago, I had to reproduce a neural model build that had only been done previously by a single colleague on her laptop, not using a package manager.
Part of my reproducing the build was to conduct all the library downloading in a miniconda environment, so at the end I had a reproducible recipe.
Is the original author seriously claiming that anybody was better off with the original, "pure" ad-hoc approach?
> Getting rid of npm and doing things manually won't make building SPAs have fewer dependencies, build would be incredibly slow and painful.
You don't think making adding dependencies incredibly slow and painful would make people have fewer of them?
In the context of my team, us getting rid of npm wouldn't change the whole SPA ecosystem. Or the various requirements we have that effectively mandate SPA like applications.
But in the context of newer ecosystems, or ones that are more tightly controlled, things might be different. For example, if Apple massively expanded the Swift standard library and made dependency management painful, iOS apps might end up having fewer dependencies.
You would decrease the number of dependencies, yes. However, your dependencies or your own code would then become huge.
Same number of lines but in fewer dependencies.
Yes, because of human limits of time and of skills.
I remember installing software in the early 90s: download the source code, read the README, find and download the dependencies, read their READMEs, repeat a few times. Sometimes one dependency could not compile because of some incompatibility or bug. Some could be fixed, some couldn't. Often everything ended with a successful compilation and install, and in one day of work I could have what I'm getting in a few minutes now.
Actually those were small programs by today's standards. My take is that we would achieve less if we had to use fewer dependencies.
By the way, the last time I compiled something from source was yesterday. It was openvpn3 on Debian 13, which is still unsupported. TLDR: it works, but the apt-get commands are a little different from the ones in BUILD.md.
There already is a (partial) solution to dependency hell: Nix.
It will at least massively help prevent things from breaking unexpectedly.
It won't prevent you from having to cascade a necessary upgrade (such as a security fix) across the entire project until resolution/new equilibrium is achieved.
My solution to the latter is simply to try to depend on as few things as possible. But eventually, the cancer will overtake the project if it keeps growing.
Source: Have worked on a million-LOC Ruby app
Nix isn't a solution to the problem of package managers. It's just a better package management system, which thus makes it easier to go to dependency hell. So I'd argue it puts fuel on the flames.
The solution is just to depend on less and manage them manually.
There's a fair bit of semantic quibbling here.
Regardless of how they define these terms, producing a list of hashes which function as a commitment to specific versions of dependencies is a technique essential to modern software development. Whatever the tools are called, and whatever they do, they need to spit out a list of hashes that can be checked into version control.
You could just use git submodules, but in practice there are better user experiences provided by language package managers (`go mod` works great).
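Go's flow is a reasonable example of the "list of hashes" approach: go.mod pins the versions, go.sum records the hashes, and both get committed.

    # Resolve direct and transitive module versions into go.mod / go.sum
    go mod tidy

    # Verify that cached modules still match the hashes recorded in go.sum
    go mod verify

    # Optionally copy the pinned sources into ./vendor for fully offline builds
    go mod vendor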
A good amount of this ranting can probably be attributed to projects and communities that aren't even playing the list of hashes game. They are resolving or upgrading dependencies in CI or at runtime or something crazy like that.
The semantic quibbling is just to explain what a package manager isn't.
Also, use git subtrees, not git submodules. What people think submodules are is actually what subtrees are; most people just don't know about them.
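A minimal subtree workflow, with the repository URL and tags made up for illustration, looks something like this:

    # Vendor a dependency into the repo at a fixed ref, squashed to one commit
    git subtree add --prefix=vendor/somelib https://github.com/example/somelib.git v1.2.3 --squash

    # Later, deliberately pull a newer tag into the same path when you choose to update
    git subtree pull --prefix=vendor/somelib https://github.com/example/somelib.git v1.3.0 --squash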
As for "good" package managers, they are still bad because of what I said in the article.
The argument here is (in brief) "Package management is hell, package managers are evil. So let's handle the hell manually to feel the pain better".
And honestly speaking: It is plain stupid.
We can all agree that abusing package management with ~10,000 micro-packages everywhere, like npm/python/ruby do, is completely unproductive and brings its own considerable maintenance burden and complexity.
But ignoring the dependency resolution problem entirely by saying "You do not need dependencies" is even dumber.
Not every person is working in an environment where shipping a giant blob executable built out of vendored static dependencies is even possible. This is a privilege the gamedev industry has, and the author forgets a bit too easily that it is domain-specific.
Some of us work in environments where the final product is an agglomerate of >100 components developed by >20 teams around the world, versioned over ~50 git repositories, and often mixed with proprietary libraries provided by third-party providers. Gluing, assembling and testing all of that is far beyond the "LOL, just stick to the SDL" mindset proposed here.
Some of us are developing libraries/frameworks that are embedded in >50 products alongside other libraries, across a hell of a lot of combinations of compilers / ABIs / platforms. This is not something you want to test nor support without automation.
Some of us have to maintain cathedrals constructed over decades of domain-specific know-how (scientific simulators, solvers, petroleum prospection tools, financial frameworks, ...) in multiple languages (Fortran, C, C++, Python, Lua, ...) that cannot just be rewritten in a few weeks because "I tell you: dependencies suck, Bro".
Managing all of that manually is just insane. And it generally finishes with a home-made, half-baked bunch of scripts that try to badly mimic the behavior of a proper package manager.
So no, there is no replacement for a proper package manager: Instead of hating the tool, just learn to use it.
Package managers are tools, and like every tool, they should be used wisely and not as a Maslow's hammer.
I am not sure how you got this conclusion from the article.
> So let's handle the hell manually to feel the pain better
This is far from my position. Literally the entire point is to make it clearer you are heading to dependency hell, rather than feel the pain better whilst you are there.
I am not against dependencies but you should know the costs of them and the alternatives. Package managers hide the complexity, costs, trade-offs, and alternative approaches, thus making it easier to slip into dependency hell.
> I am not against dependencies but you should know the costs of them and the alternatives.
You are against the usage of a tool and you propose no alternative.
Handling dependencies by vendoring them manually, like you propose in your blog, is not an alternative.
This is an oversimplification of the problem (and the problem is complex) that can only be applied to your specific usage and domain.
I mostly agree, but
> Some of us works in environment where the final product is an agglomerate of >100 of components developed by >20 teams around the world. Versioned over ~50 git repositories. Often mixed with some proprietary libraries provided by third-party providers. Gluing, assembling and testing all of that is far beyond the "LOL, just stick to the SDL" mindset proposed here.
Does this somehow prevent you from vendoring everything?
> Does this somehow prevent you from vendoring everything?
Yes. Because in these environments, sooner or later you will be shipping libraries and not executables.
Shipping libraries means that your software will need to be integrated in other stacks where you do not control the full dependency tree nor the versions there.
Vendoring dependencies in this situation is a guarantee that you will make your customers' lives miserable by throwing the diamond dependency problem right in their faces.
It certainly gets in the way. The more dependencies, the more work it is to update them, especially when for some reason you're choosing _not_ to automate that process. And the larger the dependencies, the larger the repo.
Would you also try to build all of them on every CI run?
What about the non-source dependencies, check the binaries into git?
This article is a bit all over the map with discussions of high trust societies and their relation to language design and software dependency management.
That said, I think the final takeaway is that systems that allow you to pin versions, vendor all those dependencies, and resolve/reproduce the same file tree regardless of whose machine it's on (let's assume matching architectures for simplicity here) are the goal.
Note that removing 'manually' here, this still works:
> Copying and vendoring each package {manually}, and fixing the specific versions down is the most practical approach to keeping a code-base stable, reliable, and maintainable.
The article's emphasis on the manual aspect of dependency management is a bit of a loss, as I don't particularly believe it _has to be manual_ in the sense of manually copying files from their origin into your file tree; that certainly is a real-world option, but few (myself included) would take that monk-like path again. I left this exact situation in C land and would not consider going back unless adopting something like ninja.
What the OP is actually describing is a "good" package manager feature set, and many (sadly not most/all) do support this exact feature set today.
PS I did chuckle when they defined evil in terms of something that gets you to dependency hell faster. However, we shouldn't be advocating for committing the same sins of our fathers.
As I age, I do everything I can to avoid "one more dependency". There's a perverse nerd-bragging effect that takes place, where people equate their value as a programmer with how many dependencies they can name-drop and mash up into their solution. It makes sense, since we evaluate each other, and the more dependencies you can reference in your past work, the more stepping stones you've stepped on and therefore the longer you've been on the path.
Nowadays, as I evaluate fellow programmers, I'm looking for whether they've discovered that "one more dependency" is like signing up for "one more subscription you have to remember to pay for", and what they do to try to mitigate it.
I definitely used to look for reasons to include cool new dependencies that I found, just to try and do something with cool libraries.
But as I got bit by the various issues with dependencies multiple times over the years, I have ended up preferring as few as possible and ideally zero beyond the standard library for hobby projects if I can get away with it.
Missed one of the biggest problems: most assume the world is simpler than it really is. Not all the world is Rust, Python, Go, C++, or whatever language you advocate. I have millions of lines of C++; I'm interested in other languages, but they need to interoperate with our C++. When we have a C++ library that does what you want, I don't want you adding a new package to do the same thing, as now we need to support both - sometimes we will accept that using the package is the right answer, but often we need bug compatibility.
Not sure why this argument doesn't also apply to operating systems. Maybe everyone should be writing all their programs to run on a custom micro-kernel. Surely we can't trust other programmers to write something as complicated as an operating system.
There is the question. See "Reflections on Trusting Trust" (a classic paper). However, in the end you cannot do everything you might want to, and so you must trust someone else. Operating systems are common, audited by many, and used by enough people that you can have high trust that they work in general (but there are some not worthy of trust). Package managers tend to contain many packages that are not in common use, and the only one who audits them might be you, so you had better do it yourself for each release.
If you only use a package manager for libraries that you have high trust in, then you don't need to worry - but there are so few projects you can have high trust in that manual management isn't a big deal. Meanwhile there are many, many potentially useful packages that can save you a lot of effort if you use them - but you need to manually audit each one, because if you don't, nobody will, and that will bite you.
What about the CPU microcode? Can’t trust that either. :)
I'd rather be able to update my dependencies automatically with a few commands than manually vendor all of them; keeping up to date is really important for security. I get that game developers who only ever work on building single-player games might have different opinions on "package managers", but they are in a very small niche.
One of the worst things about working at companies shipping C++ was the myriad of meta-build systems that all try to do dependency management as part of the build system, without having a separate concept of what a "package manager" is. This is truly the worst of both worlds, where people are happy to add dependencies, never update them, and never share code between projects and departments. I do not wish that way of working on my worst enemies.
Whatever problems package management brings, they are a much better problem to have than not having a package manager at all. That said, I think everyone can get better at being more discriminating about what they add to their project.
I wouldn't say I'm a dependency maximalist, but it's not far off.
Yes, shared code has costs
- more general than you likely need, affecting complexity, compile times, etc
- comes with risks for today (code) and the future (governance)
But the benefits are big. My theory for one of the causes of Rust having so many good CLIs is Cargo, because it keeps the friction low for pulling in high-quality building blocks so you can better focus on your actual problem.
Instead of resisting dependencies, I think it would be better to spend time finding ways to mitigate the costs, e.g.
- I'd love for crates.io to integrate diff.rs, provenance reporting (https://lawngno.me/blog/2024/06/10/divine-provenance.html), etc
- More direct support for security checking in cargo
- Integrating cargo-vet and/or cargo-crev into cargo
> The problem is that not everything needs to be automated, especially hell.
What a great quote.
> People abuse cars and ram into crowds, so let's not have cars in our city!
> People abuse knives and stab people, so let's not have knives in our kitchen!
> People abuse package managers and create dependency hells, so let's not have package managers in our programming language!
No matter how you see it, this fits the definition of dumbing down;
Is this what you really want?
If that is the case, then we can shake hands and I will use a different programming language.
> My general view is that package managers (and not the things I made distinctions about) are probably in general a net-negative for the entire programming landscape, and should be avoided if possible.
I am arguing that their "benefits" are only very short-term, if there is actually any benefit to them in the first place. The strawman that you present has been repeated already and does not consider that all of those other things are actually useful and good alternatives.
The obvious benefit to me is the automation; I have directly experienced these time savings.
I read that you think these automations lead to more harm than good and so are a net negative, and I understand that point of view.
However, I think this is not a tautology
I think dependency hells and bad dependencies absolutely do happen, but they are conditional on badly managed programming projects.
And I do not want to suffer from the dumbing down of stripping out package managers
If you've ever lived in a place with a localized car ban, it's awesome.
I'm very thankful for the Debian team's efforts to include most of my most commonly used software packages in their repo. Out of all the differences between my and my colleagues' workflows on macOS and Windows, this is the most impactful one. I don't remember the last time I had any kind of dependency issue. I keep updating my packages when I log on, and there are no version and/or dependency issues whatsoever.
> In real life, when you have a dependency, you are responsible for it. If the thing that is dependent on you does something wrong, like a child or business, you might end up in jail, as you are responsible for that.
Isn't this backwards? In real life, if you have a dependent, you are responsible for it. On the other hand, if you have a dependency on something, you rely on that thing, in other words it should be responsible for you. A package that is widely used in security-critical applications ought to be able to be held accountable if its failure causes harm due to downstream applications. But because that is in general impossible and most library authors would never take on the risk of making such guarantees, the risk of each dependency is taken on by the person who decides it is safe to use it, and I agree package managers sometimes make that too easy.
> Each dependency is a potential liability.
I mean, sure. So what does the solution look like? From my perspective it looks like a tool that is able to update your dependencies so that you can easily pick up bug fixes in your dependencies, which sounds an awful lot like a package manager.
> JavaScript is great example of this as there are multiple different package managers for the language (npm being one of the most popular), but because each package manager defines the concept of a package differently, it results in the need for a package manager manager.
This doesn't seem like a strong point to me. Yes, there are things like yarn, pnpm, etc. But IIUC practically all npm alternatives still define packages in the same way (a package.json at the root hosted by npmjs (or your private repo)), and the differences are ergonomic/performance related.
> [that each package manager defines the concept of a package differently] is why I am saying it is evil, as it will send you to hell quicker.
Then I think it's more of a language problem, not a problem with the concept of a package manager.
> It looks like a tool that is able to update your dependencies so that you can easily pick up bug fixes in your dependencies, which sounds an awful lot like a package manager.
If only it were that easy.
Often the update isn't source-compatible with the package that uses it, so you can't update. There are some projects I use that I can't update because I use 6 different plugins, and each updates to track the main project on a different schedule, on their own terms - meaning the only version I can use is 10 years out of date, and there appears to be no chance they will all update. (If this were critical I'd update it myself, but there are always more important things to work on, so I never will in practice.)
Sometimes a package will change license and you need to check the legalese before you update.
Sometimes a package is hijacked (see xz), and so you really should be doing an audit of every update you apply.
I agree with all of the problems that you're highlighting, but I would say that all of those problems exist whether you're doing manual dependency management or using a package manager.
The solution IMO (which is non-existent afaik) would be to integrate some kind of third party auditing service into package managers. For example, for your npm project you could add something like this to your package.json:
    "requireAuditors": [
      {
        "name": "microsoft-scanning-service",
        "url": "https://npmscanner.microsoft.com/scanner/",
        "api_key": "yourkeyhere, default to getting it from .env"
      }
    ]
And when you npm install, the version/hash is posted to all of your required auditors' URLs. npm should refuse to install any version that hasn't been audited. You can have multiple auditing services defined, maybe some of them paid / able to scan your own internal packages, etc.
I've thought about building a PoC of this myself a couple of times because it's very much on my mind, but haven't spent any time on it and am not really positioned to advocate for such a service.
Yeah, yarn and co came about because npm was slow, buggy and didn't honor its own lockfile.
Nowadays it is mostly improved, and the others differentiate themselves by enhancements to workspaces (better monorepo support) or more aggressive caching, by playing games with where the installed packages physically exist on the system.
The core functionality- what a package is- has always been the same across the package managers though, because the runtime behavior is defined by node, not the package manager.
> So what does the solution look like?
There are no solutions, only trade-offs. And the point is that not everything needs to be, nor ought to be, automated. Package managers are a good example of this.
And yes, a language with an ill-defined concept of a package in the language itself is a problem of the language, but the package managers are not making it any better.
> And yes, a language with an ill-defined concept of a package in the language itself is a problem of the language, but the package managers are not making it any better.
If a language does not provide a definition of a package but a package manager _does_, then I would say that that package manager did make that aspect of the problem better.
"I mean, sure. So what does the solution look like? From my perspective it looks like a tool that is able to update your dependencies so that you can easily pick up bug fixes in your dependencies, which sounds an awful lot like a package manager."
Exactly! Who has the time or the discipline to do that manually?
> I mean, sure. So what does the solution look like?
Obviously taking on fewer such liabilities?
The article specifically excludes this as _the_ solution to the problem:
> I am not advocating to write things from scratch.
and is clear in its target:
> That’s my general criticism: the unnecessary automation.
Yes, fewer dependencies is a solution, but it does not seem to be the author's position.
The post goes on to say that random packages are not necessarily better than what members of your team could make. At the end it gets to:
> Through manual dependency management. Regardless of the language, it is a very good idea that you know what you are depending on in your project. Copying and vendoring each package manually, and fixing the specific versions down is the most practical approach to keeping a code-base stable, reliable, and maintainable. Automated systems such as generic package managers hide the complexity and complications in a project which are much better not hidden away.
So that makes all of us human package managers. It's also true that you can get a package manager from internet folk that works better than the processes and utilities your team cobbles together to ease the burden.
Yes. That's the entire point, but it should not be automated - which is my point.
> SDL3 might fix it all but the time to integrate SDL3 would be the same time I could write it from scratch.
The Programmers’ Credo: we do these things not because they are easy, but because we thought they were going to be easy
We've effectively written all of this already because of the number of fixes we've had to make to the SDL2 code. So yes, we know what we are doing.
I’ve had a major Nissan Altima effect with this lately. A few weeks ago I set out to make a simple C and C++ package manager that just ignores dependency hell in favor of you explicitly specifying packages. And no binaries: just build from source and use Git as a backend for managing what source maps to what builds.
Plus Lua for package recipes. It’s going really well!
>just ignores dependency hell in favor of you explicitly specifying packages
Isn't that basically manual dependency hell?
I don't find the argument very convincing. If anything you should have more time to carefully vet your dependencies when you don't need to spend time manually doing the tedious bit. Making things more difficult than they need to be just to introduce friction is a ridiculous proposition.
I don't know if this is just coincidental, but this was submitted at the same time as a large NPM malware situation was exposed. https://news.ycombinator.com/item?id=45169657
Completely coincidental. I wrote the article before the situation, AND the article is a transcription of a video recorded in July.
Package managers are constraint solvers. You could manually figure out if XYZ shared library works with somebody else's code, or you could expect that they would label the range of shared library versions their code needs.
Note to self... don't use Odin.
I remember when a project failed on Kickstarter due to golang dependency hell in particular.
https://news.ycombinator.com/item?id=5796597
Where's the discussion about package sets? Always comparing yourself to JavaScript, Java, and friends is obviously easy.
There are three points of prioritization here: you can use other people's code, manually vet all the code you're running, or accept that you need to trust a social network to vet stuff for you. Pick two. This is not a solvable problem.
EDIT: I've been rate limited, so the point is: unless you're Terry Davis, you're not going to be able to write software of any real complexity. Few people are going to even bother to vet the standard library, let alone the compiler, the runtime, etc etc.
If only people knew how easy it is to implement the standard library and make it way simpler than what is usually provided, everyone would be writing their own standard libraries. You can implement one with string manipulation, files, memory management, threading, and basic timing in less than 1000 LOC of C code, as I have done before, and the biggest parts by far were console printing and filesystem stuff - mostly because of Windows UTF-16 conversion nonsense.
Use the first two, and don't rely on the third at all. That's what the article is saying.
I had this idea for a vendor-based "package manager":
What if packages were meant to be read, and config was set inside the file directly?
What if we transitioned to thinking of packages as templates, rather than generic black boxes?
I think it would drastically reduce dependencies and package complexity, and improve understanding.
You can do this in Python, today. Whenever you need a package, just add its code in a directory under your project's root. And good luck.
how do you update the package?
Honestly, he's not wrong. I use Ruby and 99% of the gems on rubygems.org are absolute trash. I use Rails and stuff like Nokogiri or Faraday, also RubyLLM, but little else because of reasons.
NPM is even worse: you import one thing and get thousands of trash libraries, so nowadays the only JS I write is vanilla, and I import ES Modules manually.
Also, Odin doesn't make adding dependencies that difficult, you can literally just throw an Odin library into your project as a folder and it's available. The Odin compiler does everything else for you.
Is SDL2 really that bad?
It's impossible to know what issues they have, since they don't specify.
But no, for the vast majority of people, SDL2 is perfectly fine, although SDL3 is a vast improvement. It's as stable and battle-tested as a cross platform multimedia library is bound to get. Opening a window and polling input is trivial.
Then again I've never even heard of the language they're using (Odin) so maybe that doesn't play well with a C library.
The bugs have nothing to do with the language and exist in C too.
I'll just link directly to some of the bugs we posted as issues to SDL:
https://github.com/libsdl-org/SDL/issues/4789 (not fixed)
https://github.com/libsdl-org/SDL/issues/4816 (closed)
https://github.com/libsdl-org/SDL/issues/4790 (closed)
And these are just the bugs we found ourselves, not the other bugs that had already been found, many of which are marked "not planned" since SDL2 is now finished.