Comment by TeMPOraL

3 days ago

That's presuming libraries need companies backing them to continue to work. That's a bad state of things in the first place.

That's presuming humans don't need money to feed themselves and continue to work.

  • That's neither here nor there.

    AI is destined to destroy the software industry, but not itself.

    Software does not decay by itself (it's literally the whole point of using digital media over analog). Libraries do not "degrade". "Bit rot" is an illusion, a fictitious force like centrifugal force in Newtonian dynamics, representing changes that happen not to a program, but to everything else around it.

    The current degree of churn in the webshit ecosystem (whose anti-patterns are increasingly seeping into and infecting other software ecosystems) is not a natural state of things. Reducing churn won't kill existing software - on the contrary, it'll just let it continue to work without changes.

    • You’re mostly right: libraries thrive by adapting to their surroundings. Mostly.

      But after just months of being unmaintained, even the best libraries start to rot away due to bugs and vulnerabilities going unfixed. Users, AI included, will start applying workarounds and mitigations, and the rot spreads to the applications (or libraries) they maintain.

      Unmaintained software is entropy, and entropy is infectious. Eventually, entire ecosystems will succumb to it, even if some life forms continue living in the hazardous wasteland.

    • I struggle to fully grasp everything you postulate. Please help me understand.

      Your original point was that libraries do not need companies behind them. From what you have written here, one reason for that is that (web) libraries mostly create churn by introducing constant changes. What I think you conclude from that is that those libraries aren't necessary, and that "freezing" everything would do no harm to the state of web development, but would do good by eliminating the churn of constantly updating to the newest state.

      What I struggle to understand is: (1) how does AI fit into this? And (2) why do you think there is so much development happening in that space, creating all the churn you mention? At this point in time, all of this development is still mostly done by humans, who are likely paid for what they do. Who pays them, and why?

    • “Bit rot is a myth” is junior dev bro pedantry.

      Bit rot isn’t some mystical decay; it’s dependency drift: APIs change, platforms evolve, security assumptions expire, build chains break. Software survives because people continuously adapt it to a moving substrate.

      Reducing churn is good. Pretending maintenance disappears is fantasy. Software doesn’t decay in isolation; it decays relative to everything it depends on. And it sounds like you don’t know anything about Newtonian dynamics either.
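A toy sketch of what "dependency drift" means in practice (all names here are made up for illustration): the application's bytes never change, but a dependency's API does, and the unchanged application breaks anyway.

```python
import json

def libjson_v1_loads(text):
    """Hypothetical dependency, release v1: one positional argument."""
    return json.loads(text)

def libjson_v2_loads(text, *, strict):
    """Hypothetical release v2: same name, but a new required keyword
    argument - a classic breaking API change."""
    return json.loads(text)

def app(loads):
    # Application code written against v1. It is byte-for-byte unchanged
    # in both runs below; only the substrate it runs on differs.
    return loads('{"ok": true}')["ok"]

print(app(libjson_v1_loads))   # works against v1
try:
    app(libjson_v2_loads)      # same app, drifted dependency
except TypeError as err:
    print("broken by dependency drift:", err)
```

The program did not rot; the world it depends on moved, which is exactly the "fictitious force" framing and the "moving substrate" framing describing the same phenomenon from opposite sides.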

    • Until you want a feature that the software doesn't have.

      I hear you about damning the shitty code the web industry as a whole is quite responsible for, but I don't see how libraries dying outright is better.