Comment by cedws

2 days ago

This is the security shortcuts of the past 50 years coming back to bite us. Software has historically been a world where we all just trust each other. I think that’s coming to an end very soon. We need sandboxing for sure, but it’s much bigger than that. Entire security models need to be rethought.

This assumes that we can get a locked down, secure, stable bedrock system and sandbox that basically never changes except for tiny security updates that can be carefully inspected by many independent parties.

Which sounds great, but the way things work now tends to be the exact opposite of that, so there will be no trustable platform to run the untrusted code in. If the sandbox, or the operating system the sandbox runs in, gets breaking changes and forces everyone to always be on a recent release (or worse, to track the main branch), then that will still be a huge supply chain risk in itself.

  • The secure boot "shim" is a project like this. Perhaps we need more core projects that can be simple and small enough to reach a "finished" state where they are unlikely to need future upgrades for any reason. Formal verification could help with this ... maybe.

    https://wiki.debian.org/SecureBoot#Shim

  • > This assumes that we can get a locked down, secure, stable bedrock system and sandbox that basically never changes except for tiny security updates that can be carefully inspected by many independent parties.

    For the most part you can. Just pin slightly-stale versions of your dependencies, after ensuring there are no known exploits for those versions. Avoid the latest updates whenever possible, and stay aware of security advisories and the versions they affect.

    Don't just update every time the dependency project updates. Update specifically for security issues, new features, and specific performance benefits. And even then avoid the latest version when possible.
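    Concretely, that kind of pinning looks like this, a sketch of a pip-style requirements file (package names and version numbers are illustrative, not recommendations):

```
# requirements.txt -- exact pins, deliberately a release or two behind,
# each checked against advisory databases before being adopted
requests==2.31.0
urllib3==2.0.7    # pin transitive dependencies too, ideally via a lock file
```

    The same idea applies to any ecosystem: exact versions in a lock file, updated deliberately rather than automatically.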

    • Sure, and that is basically what sane people do now, but it only works until something needs a security patch that was not provided for the old version. Changing one dependency is likely to cascade, so now I am open to supply chain attacks in many dependencies again (even if briefly).

      To really run code without trust would need something more like a microkernel that is the only thing in my system I have to trust, and everything running on top of that is forced to behave and isolated from everything else. Ideally a kernel so small and popular and rarely modified that it can be well tested and trusted.

  • >Which sounds great, but the way things work now tend to be the exact opposite of that, so there will be no trustable platform to run the untrusted code in.

    This is the problem with software progressivism. Some things really should just be what they are, you fix bugs and security issues and you don't constantly add features. Instead everyone is trying to make everything have every feature. Constantly fiddling around in the guts of stuff and constantly adding new bugs and security problems.

  • > This assumes that we can get a locked down, secure, stable bedrock system and sandbox that basically never changes except for tiny security updates that can be carefully inspected by many independent parties.

    Not really. You should limit the attack surface for third-party code.

    A linter running in `dir1` should not access anything outside `dir1`.
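    The confinement rule above can be sketched in a few lines of Python: a wrapper (function name is hypothetical) that refuses to hand the tool any path that resolves outside the allowed directory.

```python
from pathlib import Path

def is_within(base: str, candidate: str) -> bool:
    """Return True only if candidate resolves to a path inside base.

    resolve() normalizes ".." components and follows symlinks, so a
    symlink pointing out of dir1 is also rejected.
    """
    base_p = Path(base).resolve()
    cand_p = Path(candidate).resolve()
    try:
        cand_p.relative_to(base_p)
        return True
    except ValueError:
        return False

# A linter confined to dir1 would run this check before opening any file:
#   is_within("dir1", "dir1/src/main.py")      -> True
#   is_within("dir1", "dir1/../secrets.txt")   -> False
```

    This is only an in-process check, of course; a real sandbox would enforce the same rule at the OS level (namespaces, seccomp, etc.) so the tool can't bypass it.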

The NIH syndrome becoming best practice (a commenter below already says they "vibe-coded replacements for many dependencies") would also save quite a few jobs, I suspect. Fun times.

  • I've been doing that too. The downside is it's a lot of work for big replacements.

I've been thinking the same thing. And it's somewhat parallel to what happened to meditation vs. drugs. In the old world the dangerous insights required so many years of discipline that you could sort of trust that the person getting the insight would be ok. But then any idiot can get the insight by just eating some shrooms and oops, that's a problem. Mostly a self-harm problem in that case. But the dynamic is somewhat similar to what's happening now with LLMs and coding.

Software people could (mostly) trust each other's OSS contributions because we could trust the discipline it took in the first place. Not any more.

  • > In the old world the dangerous insights required so many years of discipline that you could sort of trust that the person getting the insight would be ok. But then any idiot can get the insight by just eating some shrooms and oops, that's a problem.

    I would think humans have been using psychedelics since before we figured out meditation. Likely even before we were humans.

    • Ah yes the stoned ape hypothesis. I don't know if there is or will ever be evidence to support the hypothesis.

      I also like the drunk monkey hypothesis.

What we need is accountability and ties to real-world identity.

If you're compromised, you're burned forever in the ledger. It's the only way a trust model can work.

The threat of being forever tainted is enough to make people more cautious, and attackers will have no way to pull off attacks unless they steal identities of powerful nodes.

Like, it shouldn't be a thing that some large open-source project has some 4th-layer nested dependency made by some anonymous developer with 10 stars on GitHub.

If instead the dependency chain had to be tied to real, verified actors, you'd know there's something at stake for them if they act maliciously. It makes attacks much less likely: there are repercussions, reputation damage, etc.

  • > The threat of being forever tainted is enough to make people more cautious

    No it's not. The blame game was very popular in the Eastern Bloc, and it resulted in a stagnant society where lots of things went wrong anyway. For instance, Chernobyl.

  • > What we need is accountability and ties to real-world identity.

    Who's gonna enforce that?

    > If you're compromised, you're burned forever in the ledger.

    Guess we can't use XZ utils anymore cause Lasse Collin got pwned.

    Also can't use Chalk, debug, ansi-styles, strip-ansi, supports-color, color-convert and others due to Josh Junon also ending up a victim.

    Same with ua-parser-js and Faisal Salman.

    Same with event-stream and Dominic Tarr.

    Same with the 2018 ESLint hack.

    Same with everyone affected by Shai-Hulud.

    Hell, at that point some might go out of their way to get people they don't like burned.

    At the same time, I think it would make more sense to stop relying on package managers that move fast and break things, and instead have OS maintainers review every package and include it in distros. Of course, that might also be absolutely insane (that's how you get an ecosystem that's anywhere from 2 months to 2 years behind upstream) and take 10x more work, but with all of these compromises, I'd probably take that, and old packages with security patches, over pulling random shit with npm or pip or whatever.

    Though having some sort of ledger of bad actors (as opposed to people who just fuck up) might also be nice, if a bit impossible to create. In the current-day world, that's potentially every person you don't know: anyone whose patches you can't verify are actually theirs (rather than an impersonator's), or anyone whose motivations aren't clear to you, especially in the case of various "helpful" Jia Tans.

  • Accountability is on the people using a billion third-party dependencies: you need to take responsibility for every line of code you use in your project.

    • If you are really talking about dependencies, I'm not sure you've thought this all the way through. Are you inspecting every line of the Python interpreter and its dependencies before running? Are you reading the compiler that built the Python interpreter?

  • > real-world identity

    This bit sounds like dystopian governance, antithetical to most open source philosophies.

    • Would you drive on bridges or ride in elevators "inspected" by anons? Why are our standards for digital infrastructure and software "engineering" so low?

      I don't blame the anons but the people blindly pulling in anon dependencies. The anons don't owe us anything.

  • There is no need for that bullshit. Guix can just set up an isolated container in seconds, not touching your $HOME at all and importing all the Python/NPM/whatever dependencies on the spot.
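    As an illustrative sketch (package names are examples, and exact flags depend on your Guix version; see `guix shell --help`):

```
# Spawn a throwaway container exposing only the named packages;
# $HOME and the rest of the filesystem are invisible to tools inside.
guix shell --container python python-requests

# --no-cwd additionally hides the current working directory.
guix shell --container --no-cwd node
```

    The container is discarded when the shell exits, so nothing the dependencies do can persist outside it.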