Comment by smithcoin

20 days ago

> This is a five-alarm fire if you're a SWE and not retiring in the next couple years.

I’m sorry, but this is such a hype beast take. In my opinion this is equivalent to telling people not to learn to drive five years ago because of self driving from Tesla. How is that going?

Every single line of code produced is a liability. This idea that you’re going to have “gas town” like agents running and building apps without humans in the loop at any point to generate liability free revenue is insane to me.

Are humans infallible? Obviously not. But if you are telling me that ‘magic probability machines’ are creating safe, secure, and compliant software with no need for engineers to participate in the output, then first I’d like to see a citation, and second, I have a bridge to sell you.

> In my opinion this is equivalent to telling people not to learn to drive five years ago because of self driving

Self-driving has different economics. We're reading tea leaves, true, but it's also true that software has zero marginal cost and that $20K pays for an engineer-month in SF.

> Every single line of code produced is a liability.

Do you have a hard spec and rock-solid test cases? If you do, you have two paths to a working prototype: 2-6 engineer-years, or $20K. The second option will greatly increase in quality, and likely decrease in price, over the next few years.
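Back-of-envelope, using the $20K-per-engineer-month figure from earlier in the thread (an assumed SF rate, not a quote from any source):

```python
# Rough cost comparison for the "2-6 engineer-years vs. $20K" claim.
# All figures are illustrative assumptions from the thread, not market data.
monthly_cost = 20_000  # USD per engineer-month (assumed SF fully-loaded rate)

human_low = 2 * 12 * monthly_cost   # 2 engineer-years
human_high = 6 * 12 * monthly_cost  # 6 engineer-years
agent_cost = 20_000                 # the claimed agent-driven alternative

print(f"human route: ${human_low:,} to ${human_high:,}")
print(f"agent route: ${agent_cost:,}")
print(f"cost ratio: {human_low // agent_cost}x to {human_high // agent_cost}x")
# → human route: $480,000 to $1,440,000
# → agent route: $20,000
# → cost ratio: 24x to 72x
```

Even if the agent route's quality lags badly, a 24-72x cost gap is the kind of margin that tends to get closed by iteration rather than ignored.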

What if the spec and the test cases are the new software? Assembly programmers used to make an argument against compiled code that's somewhat parallel to yours: every instruction is a (performance) liability.

> without humans in the loop

There will be humans, just fewer and fewer. The spec and test cases are AI-eligible too.

> safe, secure, and compliant software

I'm not sure humans' advantage here is safe, if it even exists still.

  • So let’s say you fund a single engineer for an open‑source project with $20k. The outcome will be a prototype with some interesting ideas. And yes, with a few hundred bucks' worth of AI assistance that single engineer might get much further than without (but not using any of the techniques presented in this blog). People can coalesce around the project as contributors. A seed was planted and watered a bit.

    In this case, the $20k has been burned and produced zero value. Just look at the repo issues: looks like someone trying to get attention by spamming the issue tracker and opening hundreds of PRs. As an open source project, it’s a dead end.

    So it doesn’t matter that this will “likely decrease in price over the next few years”: the value is zero, so even if superintelligence could produce this in an instant at zero cost six months from now, the outcome would still be worth zero.

    You’re assuming a kind of inverse relationship between production cost and value.

    In terms of quality, to anyone using those coding agents, it should be clear by now that letting them run autonomously and in parallel is a bad idea. That’s not going to change unless you believe LLMs will turn into something entirely different over time.

    Note that what works with humans—social interaction creating some emergent properties like innovation—doesn’t translate to LLM agents for a simple reason: they don’t have agency, shared goals, or accountability, so the social dynamics that generate innovation can’t form.

    • I agree that there's not a lot of value in your example, but it's the wrong example. AI writing code and humans refining it and maintaining it is probably an inferior proposition, more so if the project is FOSS.

      The model I'm referring to is: "if it walks like software and quacks like software, it's software." Its writers and maintainers are AI. It has a commercial purpose. Its value comes from fulfilling its requirements.

      There will be human handlers, including some who will occasionally have to dig through the dung and fix AI-idiosyncratic bugs. Fewer Ferrari designers, more Cuban 1956 Buick mechanics. It's an ugly approach, but the conjecture that, economically _or_ technically, there must be something fundamentally broken with it is very hand-wavy and dubious.

      I agree that there will be less code-level innovation overall, just like artistic value production took a big hit when we went from portraits to photographs.
