Comment by cj

4 hours ago

> Makes me want to just give up programming forever and never use a computer again.

LLMs aren’t the first thing to come along and change how people develop applications.

You had the rise of frameworks like Django, Rails, etc. Also the rise of SPAs. And also the rise of JS as a frontend+backend language.

In 3-5 years we'll have adapted to the new norm, like we have in the past.

The difference between writing assembly code and Ruby code is much smaller than the difference between programming and vibe coding.

Also, companies are pressuring employees towards adoption in novel ways. There was no such industry-wide pressure by employers in the 90s, 2000s or 2010s for engineers to use a specific tech.

  • > Also, companies are pressuring employees towards adoption in novel ways. There was no such industry-wide pressure by employers in the 90s, 2000s or 2010s for engineers to use a specific tech.

    Companies have been enforcing technology mandates since time immemorial. In the early 2000s there were definitely a lot of mandates to move away from commercial UNIX to Linux. Lots of companies began enforcing the switch to PHP, Ruby and Python for new projects.

    • Yes, but the entire industry was not pushing any one single tool at the same time. If you disliked Django, you could go to Rails. If you disliked Rails, you had Phoenix. Etc.

      Good luck disliking LLM babysitting these days

Or, it could be like asbestos and the immediate benefits are just too appealing to listen to arguments of skeptical naysayers about some vaguely defined problems that are decades away, if they even happen.

I use AI tools daily (because they feel like they're helping me), but it's not exactly hard to imagine scenarios where an explosion of slop piling up, plus harm to learning from outsourcing all thinking, results in systemic damage that actually slows the pace of technological progress given enough time.

The history of new technologies tends to average into a positive trend over a long enough time scale, but that doesn't mean there aren't individual ups and downs, including WTF moments looking back at what now seems like baffling decision-making with the benefit of hindsight.

  • Some of us are already experiencing that. For example, I handed off an initial version of something some months ago, and the AI-generated stuff they came up with was a huge buggy mess of spaghetti code neither of us understood. Months later, we've detangled it: cutting it down to a third of the size, making it far simpler to understand, and fixing several bugs in the process (one was even fixed by accident; we'd made note of it, then later when we went to fix it, it was already fixed).

  • > Or, it could be like asbestos

    If it is, the fallout will be far worse than if AI ends up living up to (reasonable) expectations.

    If it doesn’t pan out, we are going to see over a trillion dollars of capital leave the tech sector, which I think will have worse impacts on the livelihood of tech workers than if AI ends up delivering.

    This is something the naysayers need to grapple with. We’ve crossed a line where this tech needs to work simply because of the amount of money depending on that fact.

    • The asbestos hypothetical is a bit different than the "bubble popping" economic crisis scenario though. In this world, AI would just continue being adopted and shoved into every nook and cranny into which it can be made to fit, with valuations only getting bigger and bigger.

      The damage would come much later, well beyond the point where it could be simply pulled out and replaced without spending massive amounts of money and would also basically necessitate training an entire new generation of engineers.

      Then the AI giants would start appearing vulnerable, like cigarette companies in the 90s, while an AI Superfund and interstate class action are being planned. But Sam Altman would already be a centitrillionaire at that point, so it would be someone else's problem.