
Comment by galaxyLogic

1 month ago

Software products are about unique competitive value that grows over time. Products either have it or they don't. AI-produced software is like open source in a sense: you get something for free. But who's going to get rich if everybody can just duplicate your product by asking AI to do it again?

Think of investing in the stock market by asking AI to do all the trading for you. Great, maybe you make some money. But when everybody catches on that it's better to let the AI do the trading, then others' AIs are going to buy the same stocks as yours, and their price goes up. Less value for you.

Spot on. That's why so far all of the supposed solutions to 'the programmer problem' have failed.

Whether this time it will be different I don't know. But originally compilers were supposed to kill off the programmers. Then it was 3GLs and 4GLs (the '70s and '80s). Then it was 'no code', which eventually became 'low code' because those pesky edge cases kept cropping up. Now it's AI, the 'dark factory', and other fearmongering. I'll believe it when I see it.

Another HN'er has pointed me in an interesting direction that I think is more realistic: AI will become a tool in the toolbox that allows experts to do what they did before, but faster and hopefully better. It will also be the tool that generates a ton of really, really bad code that people will indeed not look at, because they cannot afford to look at it: you can generate more work for a person in a few seconds of compute time than they can cover in a lifetime. So you end up with half-baked, buggy, and insecure solutions that do sort of work on the happy path, but that also include a ton of stuff that wasn't supposed to be there in the first place and wasn't explicitly spelled out in the test set (which is a pretty good reflection of my typical interaction with AI).

The whole thing hinges on whether or not that can be fixed. But I'm looking forward to reading someone's vibe coded solution that is in production at some presumably secure installation.

I'm going to bet that 'I blame the AI' is a pattern we will be seeing a lot of.

  • In the long run, it's going to become about specifications.

    Code is valuable because it tells computers what you want them to do. If that can be done at a higher level, by writing a great specification that lets some AI dark factory somewhere write the app for you in an hour, then the code is now worthless but the spec is as valuable as the code ever was. You can just recode the entire app any time you want a change! And even if AI deletes itself from existence or whatever, a detailed specification is still worth a lot.

    Whoever figures out how to describe useful software in a way that can get AI agents to reliably rebuild it from human-authored specifications is going to get a lot of attention over the next ~decade.

    • > Whoever figures out how to describe useful software in a way that can get AI agents to reliably rebuild it from human-authored specifications

      Which is why I think there's very little threat to the various tech career paths from AI.

      Humans suck at writing specifications or defining requirements for software. It has always been the most difficult and frustrating part of the process, and always will be. And that's just actually articulating the requirements, to say nothing of agreeing on them in the first place before anyone can even start writing the spec.

      If a business already cannot clearly define what it needs to an internal dev team, with experts who can somewhat translate the messy business logic, then it has zero hope of ever doing the same for an unthinking machine and expecting any kind of reliable output.


    • One of the unexpected benefits of everyone scrambling to show that they used AI to do their job is that the value of specs and design documents is dawning on people who previously scoffed at them as busywork. Previously, if I wanted to spend a day writing a detailed document containing a spec and a discussion of tradeoffs and motivations, I'd have to hide it from my management. Now, I'm writing it for the AI, so it's fine.