
Comment by fzeroracer

6 days ago

> And no, that's not some sleight of verbal hand in measuring "productive" -- they are able to ship more value, faster.

Ship more value faster is exactly a verbal sleight of hand. That's the statement every bad product manager and finance asshole uses to advocate for shipping broken code faster. It's "more value" because more code is more content, but without some form of quality guardrails you run into situations where everything breaks. I've been on teams just like that, where suddenly everything collapses and people get mad.

Do you think compilers helped teams ship more value faster from worse developers? IDEs with autocomplete? Linters?

At the end of the day, coders are being paid money to produce something.

It's not art -- it's a machine that works and does a thing.

We can do that in ways that create a greater or lesser maintenance burden, but it's still functional.

Detractors of LLM coding tools are manufacturing reasons to avoid using another tool that helps them write code.

They need to get over the misconception of what the job is. As another comment previously quipped 'If you want to write artisanal, hand-tuned assembly that's beautiful, do that on your own time for a hobby project.'

  • > Do you think compilers helped teams ship more value faster from worse developers? IDEs with autocomplete? Linters?

    I'm tired of engaging with this false equivalence so I won't. Deterministic systems are not the same.

    > It's not art -- it's a machine that works and does a thing.

That's right. But what you need to understand is that the machines we create can and do actively harm people: leaking sensitive information, breaking systems, taking down critical infrastructure. We are engineers first and foremost and artists second, and that means designing systems to be robust and safe. If you can't understand that then you shouldn't be an engineer and should kindly fuck off.

• As an engineer, appreciate that in the span of a paragraph you went from talking about something technical... to telling someone to fuck off.

Even more humorously, you seem to think I'm making an argument that isn't in anything I wrote. (LLM to/in prod)

• There is a big difference with compilers: the developer still needs to write every single line of code, and there is a clear and unambiguous contract between the source code and what gets executed (if it's ambiguous, it's a bug).

    The thread here was talking about:

    > Well, if everyone uses a calculator, how do we learn math?

The question being whether or not AI will make developers worse at understanding what their code is doing. You can say that "it's okay if a website fails one time in 100; the user will just refresh and we're still more profitable". But wouldn't you agree that such a website is objectively of worse quality? It's cheaper, for sure.

    Said differently: would you fly in a plane for which the autopilot was vibe coded? If not, it tells you something about the quality of the code.

    Do we always want better code? I don't know. What I see is that the trend is enshittification: more profit, worse products. I don't want that.

• > [With compilers] There is a clear and unambiguous contract between the source code and what gets executed

Debatable in practice. You can't tell me you believe most developers understand what their compiler is doing with anything approaching unambiguity.

      Whether something gets unrolled, vectorized, or NOP-padded is mysterious. Hell, even memory management is mysterious in VM-based languages now.

      And yes (to the inevitable follow-up) still deterministic, but those are things that developers used to have to know, now they don't, and the world keeps spinning.
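To make the opacity point concrete, it holds even for interpreters: a minimal sketch (using only the standard-library `dis` module, in CPython) showing the bytecode compiler quietly constant-folding an expression the developer wrote out by hand:

```python
import dis

def power():
    # Written as an expression, but CPython's peephole optimizer
    # folds 2 ** 10 into the constant 1024 at compile time.
    return 2 ** 10

# The disassembly shows the folded constant; the power operation is gone.
dis.dis(power)

# The folded result is sitting in the compiled code object:
print(1024 in power.__code__.co_consts)  # True
```

Deterministic, sure, but it's a transformation almost nobody ever looks at, and the world keeps spinning anyway.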

> You can say that "it's okay if a website fails one time in 100; the user will just refresh and we're still more profitable". But wouldn't you agree that such a website is objectively of worse quality? It's cheaper, for sure.

I would say that's the reality we've been living in since ~2005. How often do SaaS products have bugs? How frequently do mobile apps ship a broken feature?

      There are two components here: (1) value/utility & (2) cost/time.

      There are many websites out there that can easily take a 1 in 100 error rate and still be useful.

      But! If such a website, by dint of its shitty design, can be built with 1/100th of the resources (or 100x websites can be built with the same), then that might be a broader win.

      Not every piece of code needs to fly in space or run nuclear reactors. (Some does! And it should always have much higher standards)

      > Said differently: would you fly in a plane for which the autopilot was vibe coded? If not, it tells you something about the quality of the code.

      I flew in a Boeing 737 MAX. To the above, that's a domain that should have called for higher software standards, but based on the incident rate I had no issue doing so.

      > Do we always want better code? I don't know. What I see is that the trend is enshittification: more profit, worse products. I don't want that.

      The ultimate tradeoff is between (expensive/less, better code) and (cheaper/more, worse code).

      If everything takes a minimum amount of cost/effort, then some things will never be built. If that minimum cost/effort decreases, then they can be.

      You and I are of like mind regarding enshittification and declining software/product standards, but I don't think standing in front of the technological advancement train is going to slow it.

      If a thing can be built more cheaply, someone will do it. And then competitors will be forced to cheapen their product as well.

      Imho, the better way to fight enshittification is creating business models that reward quality (and scale).
