Comment by slopinthebag

1 day ago

Idk exactly how to articulate my thoughts here; perhaps someone can chime in and help.

This feels like a natural consequence of the direction web development has been going for the last decade, where it's become normalised to wire together many third-party solutions rather than building on more stable foundations. So many moving parts, so many potential points of failure, and as this incident has shown, you are only as secure as your weakest link. Putting your business in the hands of a third-party AI tool (which is surely vibe-coded) carries risks.

Is this the direction we want to continue in? Is it really necessary? How much more complex do things need to be before we course-correct?

This isn't a web development concept, either. It's the Unix philosophy of writing programs that do one thing and do it well, then interconnecting them, taken to extremes that were never intended.

We need a different hosting model.

  • Just throwing it out there - the Unix way to write software is often revered. But ideas about how to write software that came from the 1970s at Bell Labs might not be the best ideas for writing software for the modern web.

    Instead of "programs that do one thing and do it well", "write programs which are designed to be used together" and "write programs to handle text streams", I might go with a foundational philosophy like "write programs that do not trust the user or the admin", because in applications connected to the internet both groups often make mistakes or are malicious. Also something like "write programs that are strict about which inputs they accept", because a lot of input is malicious; a rough sketch of what I mean is below.
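
    As an illustration of the "strict inputs" idea, here's a minimal TypeScript sketch (the names and field shapes are hypothetical, not from any particular codebase): validate against an explicit allow-list and reject everything else, rather than coercing or silently ignoring unexpected data.

    ```typescript
    // Hypothetical strict parser: unknown fields and out-of-range values
    // are hard errors, not warnings or silent coercions.
    type SignupInput = { email: string; age: number };

    function parseSignup(raw: unknown): SignupInput {
      if (typeof raw !== "object" || raw === null) throw new Error("not an object");
      const o = raw as Record<string, unknown>;
      // Strict: any field outside the allow-list is rejected outright.
      for (const key of Object.keys(o)) {
        if (key !== "email" && key !== "age") throw new Error(`unexpected field: ${key}`);
      }
      if (typeof o.email !== "string" || !/^[^@\s]+@[^@\s]+$/.test(o.email))
        throw new Error("bad email");
      if (typeof o.age !== "number" || !Number.isInteger(o.age) || o.age < 0 || o.age > 150)
        throw new Error("bad age");
      return { email: o.email, age: o.age };
    }

    // parseSignup(JSON.parse(body)) either returns a fully-vetted value or
    // throws; there is no partially-trusted in-between state.
    ```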

    • The Unix model wasn't simply "do one thing and do it well".

      It was also a different model of ownership and vetting for those focused tools. It might have been the single-source-tree model of an old UNIX or BSD, where everything from grep to cc all the way to X11 was managed as a coherent whole. Or it might have been the Linux distribution model of having dedicated packagers vet piecemeal packages in more of a bazaar, even going so far as to split scripting-language bundles into their component pieces, as distros do for Python and Perl.

      But in both of those models you were put farther away from the third-party authors bringing software into the open-source (and proprietary) supply chains.

      This led to a host of issues with getting new software to users, and to a fractal explosion of different dependency versions to potentially work around, which is one reason we saw the explosion of NPM and Cargo and the like, especially once Docker made it easy to go straight from stitching an app together with NPM on your local dev seat to getting it deployed to prod.

      But the issue isn't the focused tooling so much as hewing more closely to upstreams that could be subverted in a supply-chain attack.

      After all, it's not as if people never tried to do this with Linux distros (or even the Linux kernel itself -- see for instance https://linux.slashdot.org/story/03/11/06/058249/linux-kerne... ). But the inherent delay and indirection in that model helped make it less of a serious risk.

      But even if you only use 1 NPM package instead of 100, if it's a big enough package you can assume it's going to be a large target for attacks.

    • > Just throwing it out there - the Unix way to write software is often revered. But ideas about how to write software that came from the 1970s at Bell Labs might not be the best ideas for writing software for the modern web.

      GP said it's about taking the Unix philosophy to extremes; you're arguing against something different.

      Anything taken to extremes is bad; the key word there is "extremes". There is nothing wrong with the Unix philosophy, as "do one thing and do it well" never meant "thousands of dependencies over which you have no control, pulled in without review or thought".

    • I do not see what this has to do with Unix. The problem is not that programs interoperate or handle text streams. The problems are a) the supply-chain issues in modern web-software (and, thanks to Rust, now also system-level) development, and b) that web applications do not run under the user's permissions but act on the user's behalf via token-based authentication schemes, as sketched below.
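
      A minimal Node/TypeScript sketch of point b), with all names hypothetical: the server process holds a single ambient, fully-privileged credential, and "acting as alice" exists only as an application-level check of a bearer token, not as an OS- or database-level permission boundary.

      ```typescript
      // The whole process shares one ambient credential; neither the OS
      // nor the database distinguishes which end user a query is made for.
      import http from "node:http";

      const DB_URL = process.env.DATABASE_URL; // full read/write access

      // Hypothetical token store standing in for a real session backend.
      const sessions = new Map<string, string>([["tok_abc123", "alice"]]);

      http.createServer((req, res) => {
        const token = (req.headers.authorization ?? "").replace(/^Bearer /, "");
        const user = sessions.get(token);
        if (!user) { res.writeHead(401); res.end(); return; }
        // From here on, "running as alice" is purely a convention inside
        // this process: a bug or a compromised dependency can reach any
        // user's data, unlike a Unix process running under alice's uid.
        res.end(`hello ${user}`);
      }).listen(8080);
      ```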

  • > We need a different hosting model.

    There really isn't an alternative here, IMO. Either:

    1. Somebody else does it

    2. You do it

    Much happier doing it myself tbh.

    • There's a lot of wiggle room on how you define "it". At the ends of the spectrum it's obvious, but in the middle it gets a bit sticky.

  • It's not a hosting-model problem; it's a fundamental failure of software design and systems engineering/architecture.

    Imagine if cars were developed like websites, with your brakes depending on a live connection to a third-party plugin on some website. Insanity, right? Yet that's considered normal for web businesses people depend on for privacy, security, finances, transportation, healthcare, etc.

    When a company's brakes go out today, we all just shrug, watch the car crash, then pick up the pieces and continue like it's normal. I have yet to hear a single CEO issue an ultimatum that the OWASP Top 10 (just an example) will be prevented by X date. Because they don't really care. They'll only lose a few customers, and everyone else will shrug and keep using them. If we vote with our dollars, we've voted to let it continue.