Comment by codemonkey-zeta

2 months ago

I'm coming to the unfortunate realization that supply chain attacks like this are simply baked into the modern JavaScript ecosystem. Vendoring can mitigate your immediate exposure, but it does not solve the underlying problem.

These attacks may just be the final push I needed to take server rendering (without js) more seriously. The HTMX folks convinced me that I can get REALLY far without any JavaScript, and my apps will probably be faster and less janky anyway.

Traditional JS is actually among the safest execution environments ever created. Every day, billions of devices run untrusted JS code, and no other platform has seen sandboxed execution at such scale. In nearly three decades, there have been very few large, successful attacks on browser engines. That makes a browser-derived JS engine the perfect tool to build a server-side framework out of.

However, processes and practices around NodeJS and npm are in dire need of a security overhaul. leftpad is a cultural problem that needs to be addressed. To start with, snippets don't need to be on npm.
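To be concrete about what "snippets" means here: the canonical example is tiny, and modern JS even ships the equivalent built in. A sketch (not the actual left-pad source):

```javascript
// A left-pad-style helper: pad `str` on the left with `ch` until it is
// `len` characters long. The built-in String#padStart makes it redundant.
function leftPad(str, len, ch = ' ') {
  str = String(str);
  while (str.length < len) str = ch + str;
  return str;
}

console.log(leftPad('7', 3, '0'));  // "007"
console.log('7'.padStart(3, '0'));  // "007" — the built-in equivalent
```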

  • Sandboxing doesn't do any good if the malicious code and target data are in the same sandbox, which is the whole point of these supply-chain attacks.

    • But if we think about a release publishing chain like a BSD process separation, why do they have to be?

      Sure, there will be a step/stage that requires access to NPM publish credentials. But why does that stage need to execute any code beyond a very small footprint of vetted code? It should just pick up a packaged, signed artifact and move it to NPM.

      The compilation/packaging step, on the other hand, doesn't need publishing rights to NPM. Ideally, it should only get a filesystem with the sources, the dependencies, and the few shared libraries and /sys or /proc entries it may need to function. Why does downloading a dependency need access to your entire filesystem? Maybe it needs a few allowed secrets, but that's about it.

      It's certainly a lot of change to existing pipelines and habits, and it's certainly possible to poke holes in it if you want things to be easy. But it would raise the bar quite a bit.

    • I mean, what does any good if your supply chain is attacked?

      This said, fewer vendors supplying packages may reduce exposure, but it doesn't remove it.

      Either way, not running bleeding-edge package versions unless it's a known security fix seems like a good idea.

      2 replies →

  • > Traditional JS is actually among the safest environments ever created.

    > However, processes and practices around NodeJS and npm are in dire need of a security overhaul. leftpad is a cultural problem that needs to be addressed. To start with, snippets don't need to be on npm.

    Traditional JS is the reason we have all of these problems around NodeJS and npm. It's a lot better than it was, but a lot of JS tooling came up in the time when ES5 and older were the standard, and to call those versions of the language lacking is... charitable. There were tons of things that you simply couldn't count on the language or its standard library to do right, so a culture of hacks and bandaids grew up around it. Browser disparities didn't help either.

    Then people said, "Well, why don't we all share these hacks and bandaids so that we don't have to constantly reinvent the wheel?", and that's sort of how npm got its start. And of course, it was the freewheeling days of the late 00s/early 10s, when you were supposed to "move fast and break things" as a developer, so you didn't have time to really check if any of this was secure or made any sense. The business side wanted the feature and they wanted it now.

    The ultimate solution would be to stop slapping bandaids and hacks on the JS ecosystem by making a better language but no one's got the resolve to do that.

    • Python is the other extreme, with an incredibly heavyweight standard library that has a built-in function to do just about anything.

      E.g. there is a built-in function that takes elements pairwise from a list! That level of minutiae being included feels nuts coming from other languages.
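      (The built-in in question is Python's `itertools.pairwise`; the point stands that it's a few lines anywhere. A rough JS sketch of the same idea:)

```javascript
// Yield consecutive overlapping pairs from any iterable, like Python's
// itertools.pairwise: [1, 2, 3, 4] -> [1, 2], [2, 3], [3, 4]
function* pairwise(iterable) {
  let prev;
  let first = true;
  for (const item of iterable) {
    if (!first) yield [prev, item];
    prev = item;
    first = false;
  }
}

console.log([...pairwise([1, 2, 3, 4])]); // [ [ 1, 2 ], [ 2, 3 ], [ 3, 4 ] ]
```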

  • Javascript doesn't have a standard library; until it does, the 170 million[1] weekly downloads of packages like UUID will continue. You can't expect people to re-write everything over and over.

    [1]https://www.npmjs.com/package/uuid

    • That's not the problem. There is a cultural (and partly technical) aversion in JavaScript to large libraries - this is where the issue comes from. So, instead of having something like org.apache.commons in Java or Boost in C++ or Posix in C, larger libraries that curate a bunch of utilities missing from the standard library, you get an uncountable number of small standalone libraries.

      I would bet that you'll find a third party `leftpad` implementation in org.apache.commons or in Spring or in some other collection of utils in Java. The difference isn't the need for 3rd party software to fix gaps in the standard library - it's the preference for hundreds of small dependencies instead of one or two larger ones.

      3 replies →

    • > You can't expect people to re-write everything over and over.

      Call me crazy but I think agentic coding tools may soon make it practical for people to not be bogged down by the tedium of implementing the same basic crap over and over again, without having to resort to third party dependencies.

      I have a little pavucontrol replacement I'm walking Claude Code through. It wanted to use pulsectl but, to see what it could do, I told it no. Write your own bindings to libpulse instead. A few minutes later it had that working. It can definitely write crap like leftpad.

    • You have the DOM and Node APIs, which I think cover more than the C standard library or the Common Lisp one. Adding direct dependencies is done by every project. The issue is the sprawling dependency trees of NPM and JS culture.

      > You can't expect people to re-write everything over and over.

      That’s the excuse everyone is giving, then you see thousands of terminal libraries and calendar pickers.

      4 replies →

  • None of those security guarantees matter when you take out the sandbox, which is exactly what server-side JS does.

    The isolated context is gone and a single instance of code talking to an individual client has access to your entire database. It’s a completely different threat model.

  • I think the smallest C library I've seen was a single file to include in your project if you want curses-like terminal control on Windows. A lot of libraries on npm (and cargo) should be a gist or a blog post.

    • 15+ years ago, people copy-pasted utility functions from Stack Overflow; now they npm install packages for a function or two.

  • Interestingly, AI should be able to help a lot with the desire to pull in those snippets.

    What I'm wondering is whether it would help the ecosystem if you could instead load raw snippets into your codebase and source control, as opposed to having them as dependencies.

    So, e.g., the shadcn component-pasting approach.

    For things like leftPad, CLI colors and others, you would just load raw TypeScript code from a source, and you would immediately notice anything malicious, either at a glance or during code review.

    You would leave actual npm packages to actual frameworks / larger packages where this approach doesn't make sense, and expect higher scrutiny and multi-party approval of releases there.

> I'm coming to the unfortunate realization that supply chain attacks like this are simply baked into the modern JavaScript ecosystem.

I see this odd take a lot - the automatic narrowing of the scope of an attack to the single ecosystem it occurred in most recently, without any real technical argument for doing so.

What's especially concerning is I see this take in the security industry: mitigations put in place to target e.g. NPM, but are then completely absent for PyPi or Crates. It's bizarre not only because it leaves those ecosystems wide open, but also because the mitigation measures would be very similar (so it would be a minimal amount of additional effort for a large benefit).

  • Could you say more about what mitigations you’re thinking of?

    I ask because I think the directionality is backwards here: I've been involved in packaging-ecosystem security for the last few years, and I'm generally of the opinion that PyPI has been ahead of the curve on implementing mitigations. Specifically, I think widespread trusted-publishing adoption would have made this attack less effective, since there would be fewer credentials to steal, but npm only implemented trusted publishing recently[1]. Crates also implemented exactly this kind of self-scoping, self-expiring credential exchange ahead of npm.

    (This isn’t to malign any ecosystem; I think people are also overcorrecting in treating this like a uniquely JavaScript-shaped problem.)

    [1]: https://github.blog/changelog/2025-07-31-npm-trusted-publish...

  • Most people have addressed the package registry side of NPM.

    But NPM has a much, much bigger problem on the client side, that makes many of these mitigations almost moot. And that is that `npm install` will upgrade every single package you depend on to its latest version that matches your declared dependency, and in JS land almost everyone uses lax dependency declarations.

    So, an attacker who simply publishes a new patch version of a package they have gained access to will likely poison a good chunk of all of the users of that package in a relatively short amount of time. Even if the projects using this are careful and use `npm ci` instead of `npm install` for their CI builds, it will still easily get developers to download and run the malicious new version.

    Most other ecosystems don't have this unsafe-by-default behavior, so deploying a new malicious version of a previously safe package is not such a major risk as it is in NPM.
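    To illustrate the mechanics, here is a simplified sketch of the caret-range matching that makes this automatic (real resolution is done by the `semver` package, and 0.x ranges behave differently — this is illustrative only):

```javascript
// Simplified sketch of npm's caret-range semantics: "^1.2.3" accepts any
// later version with the same major (>= 1.2.3, < 2.0.0), so a freshly
// published malicious 1.2.4 or 1.9.0 gets pulled in automatically.
// (Real resolution uses the `semver` package; 0.x ranges behave differently.)
function satisfiesCaret(version, range) {
  const parse = v => v.split('.').map(Number);
  const [vMaj, vMin, vPat] = parse(version);
  const [rMaj, rMin, rPat] = parse(range.slice(1)); // strip the leading ^
  if (vMaj !== rMaj) return false;
  if (vMin !== rMin) return vMin > rMin;
  return vPat >= rPat;
}

console.log(satisfiesCaret('1.2.4', '^1.2.3')); // true  — auto-upgraded
console.log(satisfiesCaret('1.9.0', '^1.2.3')); // true  — minor bumps too
console.log(satisfiesCaret('2.0.0', '^1.2.3')); // false — majors are pinned
```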

    • > in JS land almost everyone uses lax dependency declarations

      They do, BUT.

      Dependency versioning schemes are much more strictly adhered to within JS land than in other ecosystems. PyPi is a mishmash of PEP 440, SemVer, some packages incorrectly using one in the format of the other, & none of the 3 necessarily adhering to the standard they've chosen. Other ecosystems are even worse.

      Also - some ecosystems (PyPi again) are committing far worse offences than lax versioning - versionless dependency declaration. Heavy reliance on requirements.txt without lockfiles where half the time version isn't even specified at all. Astral/Poetry are improving the situation here but things are still bad.

      Maven land is full of plugins with automated pom.xml version templating that has effectively the same effect as lax versioning, but without any strict adherence to any kind of standard like semver.

      Yes, the situation in JS land isn't great, but there are much worse offenders out there.

      10 replies →

    • `npm install` uses a lockfile by default and will not change versions. No, not transitives either. You would have to either manually change `package.json` or call `npm update`.

      You'd have to go out of your way to make your project as bad as you're describing.

      9 replies →

  • I agree other repos deserve a good look for potential mitigations as well (PyPI too, has a history of publishing malicious packages).

    But don't brush off the "special status" of NPM here. It is unique in that, JS being the language of both front-end and back-end, it is much easier for crooks to sneak in malware that ends up running in visitors' browsers and affects them directly. And that makes it a uniquely attractive target.

    • npm in itself isn't special at all. Maybe the userbase is, but that's irrelevant, because the mitigation is pretty easy, 99.9999% effective, works for every package manager, and boils down to:

      1. Thoroughly and fully analyze any dependency tree you plan to include.
      2. Immediately freeze all of its versions.
      3. Never update without a very good reason, or without repeating 1 and 2.

      In other words: simply be professional, and face the logical consequences if you aren't. If you think one package manager is "safer" than others for magic reasons, odds are you'll find out the hard way sooner or later.

      7 replies →

  • Which mitigations specifically are in npm but not in crates.io?

    As far as I know crates.io has everything that npm has, plus

    - strictly immutable versions[1]

    - fully automated and no human in the loop perpetual yanking

    - no deletions ever

    - a public and append only index

    Go modules go even further and add automatic checksum verification per default and a cryptographic transparency log.

    Contrast this with docker hub for example, where not even npm's basic properties hold.

    So, it is more like

    docker hub ⊂ npm ⊂ crates.io ⊂ Go modules

    [1] Nowadays npm has this arguably too

    • > Go modules go even further and add automatic checksum verification per default

      Cargo lockfiles contain checksums and Cargo has used these for automatic verification since time immemorial, well before Go implemented their current packaging system. In addition, Go doesn't enforce the use of go.sum files, it's just an optional recommendation: https://go.dev/wiki/Modules#should-i-commit-my-gosum-file-as... I'm not aware of any mechanism which would place Go's packaging system at the forefront of mitigation implementations as suggested here.

    • To clarify (a lot of sibling commenters misinterpreted this too so probably my fault - can't edit my comment now):

      I'm not referring to mitigations in public repositories (which you're right, are varied, but that's a separate topic). I'm purely referring to internal mitigations in companies leveraging open-source dependencies in their software products.

      These come in many forms, everything from developer education initiatives to hiring commercial SCA vendors, & many other things in between like custom CI automations. Ultimately, while many of these measures are done broadly for all ecosystems when targeting general dependency vulnerabilities (CVEs from accidental bugs), all of the supply-chain-attack motivated initiatives I've seen companies engage in are single-ecosystem. Which seems wasteful.

  • I mostly agree. But NPM is special, in that the exposure is so much higher. The hypothetical python+htmx web app might have 10s of dependencies (including transitive) whereas your typical Javascript/React will have 1000s. All an attacker needs to do is find one of many packages like TinyColor or Leftpad or whatever and now loads of projects are compromised.

    • Stuff like Babel, React, Svelte, Axios, Redux, Jest… should be self contained and not depend on anything other than being a peer dependency. They are core technological choices that happens early in the project and is hard or impossible to replace afterwards.

      5 replies →

    • > NPM is special, in that the exposure is so much higher.

      NPM is special in the same way as Windows is special when it comes to malware: it's a more lucrative target.

      However, the issue here is that - unlike Windows - targeting NPM alone does not take significantly less effort than targeting software registries more broadly. The extra cost of covering a lot of popular languages is small, so imo focusing purely on NPM isn't a worthwhile trade-off.

Until you go get malware

Supply chain attacks happen at every layer where there is package management or a vector onto the machine or into the code.

What NPM should do if they really give a shit is start requiring 2FA to publish. Require a scan prior to publish. Sign the package with hard keys and signature. Verify all packages installed match signatures. Semver matching isn’t enough. CRC checks aren’t enough. This has to be baked into packages and package management.

  • > Until you go get malware

    While technically true, I have yet to see Go projects importing thousands of dependencies. They may certainly exist, but are absolutely not the rule. JS projects, however...

    We have to realize that while supply chain attacks can happen anywhere, the best mitigations are development culture and a solid standard library - looking at you, cargo.

    I am a JS developer by trade and I think that this ecosystem is doomed. I absolutely avoid even installing node on my private machine.

  • > Sign the package with hard keys and signature.

    That's really the core issue. Developer-signed packages (npm's current attack model is "Eve doing a man-in-the-middle attack between npm and you," which is not exactly the most common threat here) and a transparent key registry should be minimal kit for any package manager, even though all, or at least practically all, the ecosystems are bereft of that. Hardening API surfaces with additional MFA isn't enough; you have to divorce "API authentication" from "cryptographic authentication" so that compromising one doesn't affect the other.

    • How are users supposed to build and maintain a trust store?

      In a hypothetical scenario where npm supports signed packages, let's say the user is in the middle of installing the latest signed left-pad. Suddenly, npm prints a warning that says the identity used to sign the package is not in the user's local database of trusted identities.

      What exactly is the user supposed to do in response to this warning?

      2 replies →

  • > What NPM should do if they really give a shit is start requiring 2FA to publish.

    How does 2FA prevent malware? Anyone can get a phone number to receive a text or add an authenticator to their phone.

    I would argue a subscription model for 1 EUR/month would be better. The money received could pay for certification of packages, and the credit card on file could leverage the security of the payments system.

  • How will multi-factor-authentication prevent such a supply chain issue?

    That is, if some attacker creates some trivial but convenient dummy package and 2 years later half the package hub somehow depends on it, the attacker will just use their legit credentials to pwn everyone and their dog. This is not even about stealing credentials. It's a cultural issue: bare blind trust, a blank check without even an expiry date.

    https://en.wikipedia.org/wiki/Trust,_but_verify

    • That's an entirely different issue from what we're seeing here. If an attacker rug-pulls, of course nothing can be done about that other than security scanning. Arguably, some kind of package security scanning is a core service that a lot of organisations would not think twice about paying npm for.

      1 reply →

  • If NPM really cared, they'd stop recommending people use their poorly designed version control system that relies on late-fetching third-party components required by the build step, and they'd advise people to pick a reliable and robust VCS like Git for tracking/storing/retrieving source code objects and stick to that. This will never happen.

    NPM has also been sending out nag emails for the last 2+ years about 2FA. If anything, that constituted an assist in the attack on the Junon account that we saw a couple weeks ago.

    • NPM lock files seem to include hashes for integrity checking, so as long as you check the lock file into the VCS, what's the difference?

      1 reply →

  • NPM does require 2FA to publish. I would love a workaround! Isn't it funny that even here on HN, misinformation is constantly being spread?

    • > The malware includes a self-propagation mechanism through the NpmModule.updatePackage function. This function queries the NPM registry API to fetch up to 20 packages owned by the maintainer, then force-publishes patches to these packages.

    • npm offers 2FA but it doesn't really advertise that it has a phishing-resistant 2FA (security keys, aka passkeys, aka WebAuthn) available and just happily lets you go ahead and use a very phishable OTP if you want. I place much of the blame for publishers getting phished on npm.

They are. Any language that depends heavily on package managers and lacks a standard lib is vulnerable to this.

At some point people need to realize and go back to writing vanilla js, which will be very hard.

The rust ecosystem is also the same. Too much dependence on packages.

An example of doing it right is golang.

  • The solution is not to go back to vanilla JS, it's for people to form a foundation and build a more complete utilities library for JS that doesn't have 1000 different dependencies, and can be trusted. Something like Boost for C++, or Apache Commons for Java.

    • > Something like Boost for C++, or Apache Commons for Java.

      Honestly I wish Python worked this way too. The reason people use Requests so much is because urllib is so painful. Changes to a first-party standard library have to be very conservative, which ends up leaving stuff in place that nobody wants to use any more because they have higher standards now. It'd be better to keep the standard library to a minimum needed more or less just to make the REPL work, and have all of that be "builtin" the way that `sys` is; then have the rest available from the developers (including a default "full-fat" distribution), but in a few separately-obtainable pieces and independently versioned from the interpreter.

      And possibly maintained by a third party like Boost, yeah. I don't know how important that is or isn't.

  • Python and Rust both have decent standard libraries, but it is just a matter of time before this happens in those ecosystems. There is nothing unique about this specific attack that could only happen in JavaScript.

  • >and go back to writing vanilla js

    Add it to the list of things that won't happen. Companies are filled with node_modules importers these days.

    Even worse, now you have to check for security flaws in that JS that's been written by node_modules importers.

    That, or someone could write a standard library for JS?

  • Some of us are fortunate to have never left vanilla JS.

    Of course that limits my job search options, but I can't feel comfortable signing off on any project that includes more dependencies than I can count at a glance.

Is the difference between the number of dev dependencies for e.g. VueJS (a JavaScript library for marshalling JSON Ajax responses into UI) and HTMX (a JavaScript library for marshalling HTML Ajax responses into UI) meaningful?

There is a difference, but it's not an order of magnitude and neither is a true island.

Granted, deciding not to use JS on the server is reasonable in the context of this article, but for the client htmx is as much a js lib with (dev) dependencies as any other.

https://github.com/bigskysoftware/htmx/blob/master/package.j...

https://github.com/vuejs/core/blob/main/package.json

  • Except that htmx's recommended usage is as a single <script> injected directly into your HTML page, not as an npm dependency. So unless you are an htmx contributor you are not going to be installing the dev dependencies.

AFAICT, the only thing this attack relies on is the lack of scrutiny by developers when adding new dependencies.

Unless this lack of scrutiny is exclusive to the JavaScript ecosystem, this attack could just as well have happened in Rust or Golang.

  • I don't know Go, but Rust absolutely has the same problem, yes. So does Python. NPM is being discussed here, because it is the topic of the article, but the issue is the ease with which you can pull in unvetted dependencies.

    Languages without package managers have a lot more friction to pull in dependencies. You usually rely on the operating system and its package-manager-humans to provide your dependencies; or on primitive OSes like Windows or macOS, you package the dependencies with your application, which involves integrating them into your build and distribution systems. Both of those involve a lot of manual, human effort, which reduces the total number of dependencies (attack points), and makes supply-chain issues like this more likely to be noticed.

    The language package managers make it trivial to pull in dozens or hundreds of dependencies, straight from some random source code repository. Your dependencies can add their own dependencies, without you ever knowing. When you have dozens or hundreds of unvetted dependencies, it becomes trivial for an attacker to inject code they control into just one of those dependencies, and then it's game over for every project that includes that one dependency anywhere in their chain.

    It's not impossible to do that in the OS-provided or self-managed dependency scenario, but it's much more difficult and will have a much narrower impact.

    • If you try installing npm itself on debian, you would think you are downloading some desktop environment. So many little packages.

  • There is little point in you scrutinizing new dependencies.

    Many who claim to fully analyze all dependencies are probably lying. I did not see anyone in the comments sharing their actual dependency count.

    Even if you depend only on Jest - Meta's popular test runner - you add 300 packages.

    Unless your setup is truly minimalistic, you probably have hundreds of dependencies already, which makes obsessing over some more rather pointless.

  • At least in the JS world there are more people (often also more inexperienced people) who will add a dependency willy-nilly. This is due to many people starting out with JS these days.

  • JavaScript does have some pretty insane dependency trees. Most other languages don’t have anywhere near that level of nestedness.

    • Don't they?

      I just went to crates.io and picked a random newly updated crate, which happened to be pixelfix, which fixes transparent pixels in pngs.

      It has six dependencies and hundreds of transitive dependencies, many of which appear to be small and highly specific, à la left-pad.

      https://crates.io/crates/pixelfix/0.1.1/dependencies

      Maybe this package isn't representative, but it feels pretty identical to the JS ecosystem.

      10 replies →

    • This makes little sense. Any popular language with a lax package management culture will have the exact same issue, this has nothing to do with JS itself. I'm actually doing JS quasi exclusively these days, but with a completely different tool chain, and feel totally unconcerned by any of these bi-weekly NPM scandals.

The blast radius is made far worse by npm having the concept of "postinstall" which allows any package the ability to run a command on the host system after it was installed.

This works for deps of deps as well, so anything in your node_modules has access to this hook.

It's a terrible idea and something that ought to be removed or replaced by something much safer.

  • I agree in principle, but child_process is a thing so I don't think it makes much difference. You are pwned either way if the package can ever execute code.

Simply avoiding Javascript won't cut it.

While npm is a huge and easy target, the general problem exists for all package repositories. Hopefully a supply chain attack mitigation strategy can be better than hoping attackers target package repositories you aren't using.

While there's a culture prevalent in Javascript development to ignore the costs of piling abstractions on top of abstractions, you don't have to buy into it. Probably the easiest thing to do is count transitive dependencies.

Javascript is badly over-used and over-depended on. So many websites just display text and images, but have extremely heavy javascript libraries because that's what people know and that is part of the default, and because it enables all the tracking that powers the modern web. There's no benefit to the user, and we'd be better off without these sites existing if there were really no other choice but to use javascript.

  • NPM does seem vastly overrepresented in these types of compromises, but I don't necessarily think that e.g. PyPI is much better in terms of security. So you could very well be correct that NPM is just a nicer, perhaps bigger, target.

    If you can sneak malware into a JavaScript application that runs in millions of browsers, that's a lot more useful than getting at some number of servers running a module as part of a script, whose environment is a bit unknown.

    Javascript really could do with a standard library.

  • > So many websites just display text and images

    Eh... This over-generalises a bit. That can be said of anything really, including native desktop applications.

    • Is that true? The things people use native desktop applications for nowadays tend to be exactly those which aren't just neat content displays. Spreadsheets, terminals, text-editors, CAD software, compilers, video games, photo-editing software. The only things I can think of that I use as just text/image displays are the file-explorer and image/media-viewer apps, of which there are really only a handful on any given OS.

      1 reply →

Rendering template partials server-side and fetching/loading content updates with HTMX in the browser seems like the best of all worlds at this point.

> These attacks may just be the final push I needed to take server rendering (without js) more seriously

Have fun, seems like a misguided reason to do that though.

A. A package hosted somewhere using a language was compromised!

B. I am not going to program in the language anymore!

I don't see how B follows A.

Why is this inevitable? If you use only easily verifiable packages, you've lost nothing. The whole problem of npm automatically executing postinstall scripts was fixed for me when pnpm started asking every time a new package wanted to do that.

HTMX is full of JavaScript. Server-side-rendering without JavaScript is just back to the stuff Perl and PHP give you.

  • I don't think the point is to avoid Javascript, but to avoid depending on a random number of third-parties.

    > Server-side-rendering without JavaScript is just back to the stuff Perl and PHP give you.

    As well as Ruby, Python, Go, etc.

> The HTMX folks convinced me that I can get REALLY far without any JavaScript

HTMX is JavaScript.

Unless you meant your own JavaScript.

  • When we say 'htmx allows us to avoid JavaScript', we mean two things: (1) we typically don't need to rely on the npm ecosystem, because we need very few (if any) third-party JavaScript libraries; and (2) htmx and HTML-first allow us to avoid writing a lot of custom JavaScript that we would have otherwise written.

This is going to become an issue for a lot of package managers, not just npm. Npm is clearly a very viable target right now, though. These attacks are going to get more and more sophisticated.

Took that route myself and I don't regret it. Now I can at least entirely avoid the Node.js ecosystem.

> supply chain attacks

You all really need to stop using this term when it comes to OSS. Supply chain implies a relationship; none of these companies or developers have a relationship with the creators beyond including their packages.

Call it something like "free code attacks" or "hobbyist code attacks."

  • “code I picked up off the side of the road”

    “code I somehow took a dependency on when copying bits of someone’s package.json file”

    “code which showed up in my lock file and I still don’t know how it got there”

  • A supply chain can have hobbyists, there's no particular definition that says everyone involved must be a professional registered business.