Comment by kentonv

4 days ago

> 1. Its output is unreliable at best

> 2. That output often looks correct to an untrained eye and requires expert intervention to catch serious mistakes

The thing is, this is true of humans too.

I review a lot of human code. I could easily imagine a junior engineer creating CVE-2025-4143. I've seen worse.

Would that bug have happened if I had written the code myself? Not sure, I'd like to think "no", but the point is moot anyway: I would not have personally been the one to write that code by hand. It likely would have gone to someone more junior on the team, and I would have reviewed their code, and I might have forgotten to check for this all the same.

In short, whether it's humans or AI writing the code, it was my job to have reviewed the code carefully, and unfortunately I missed here. That's really entirely on me. (It's particularly frustrating for me as this particular bug was on my list of things to check for and somehow I didn't.)

> 3. The process automates away a task that many people rely on for income

At Cloudflare, at least, we always have 10x more stuff we want to work on than we have engineers to work on it. The number of engineers we can hire is basically dictated by revenue. If each engineer is more productive, though, then we can ship features faster, which hopefully leads to revenue growing faster. Which means we hire more engineers.

I realize this is not going to be true everywhere, but in my particular case, I'm confident saying that my use of AI did not cause any loss of income for human engineers, and likely actually increased it.

I mean, fair. It's true that humans aren't that great at writing code that can't be exploited, and the blog post makes this point too: comparing a junior engineer's output with an LLM's output, the LLM does the same thing for cheaper.

I would argue that a junior engineer offers a more valuable feature--the ability to ask that junior engineer questions after the fact, and ideally the ability to learn and eventually become a senior engineer--but if you're looking at just the cost of a junior engineer doing junior engineer things... yeah, no, the LLM does it more efficiently. If you assume the goal is simply to write code more cheaply, LLMs win.

However, I'd like to point out--again--that this isn't going to be used to replace junior engineers; it's going to be used to replace senior engineers. Senior engineers cost more than junior engineers, so if you want each engineer to be more productive per dollar (and assume, as many shareholders do, that software engineers are fungible), then the smart thing to do is replace the more costly engineer. After all, the whole point of AI is to be smart enough to automate things, right?

You and I understand that a senior engineer's job is very different from a junior engineer's job, but a stockholder doesn't--because a stockholder only needs to know how finance works to be a successful stockholder. Furthermore, the stockholder's goal is simply to make as much money as possible per quarter--partly because he can just walk out if the company starts going under, often with a bigger "severance package" than any of the engineers in the company. The incentives are aligned not only for the stockholder not to know why getting rid of senior engineers is a bad idea, but not to care. Were I in your position, I would be worried about losing my job--not because I didn't catch the issue, but because the sales pitch is that the machine can replace me.

Aside: Honestly, I don't really blame you for getting caught out by that bug. I'm by no means an expert on anything to do with OAuth, but it looks like the kind of thing that's a nightmare to catch, because it's misbehavior under conditions that are--well--only seen in maliciously crafted requests. If it weren't something that has been known about since the RFC, it would probably have taken a lot longer for someone to find.
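For context on the class of bug: as I understand it, CVE-2025-4143 involved redirect URI handling in an OAuth authorization flow--exactly the kind of check RFC 6749 has warned about since it was published. The sketch below is only an illustration of that class of check, not the library's actual code; all names here are invented for the example.

```typescript
// Hypothetical sketch of the check RFC 6749 (section 3.1.2.3) calls for: the
// authorization server must verify the requested redirect_uri against the URIs
// registered for the client before issuing any code or token. Exact string
// comparison is the safe approach; prefix or pattern matching has historically
// enabled token-stealing redirects.
// (Function and variable names are invented for illustration.)
function isAllowedRedirectUri(registeredUris: string[], requested: string): boolean {
  // Exact match only: normalization or partial matching widens the attack surface.
  return registeredUris.includes(requested);
}
```

A request whose redirect_uri is not an exact registered match should be rejected up front, before the authorization code is ever generated.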

  • Luckily, shareholders do not decide who to hire and fire. The actual officers of the company, hopefully, understand why senior engineers are non-fungible. Tech companies, at least, seem to understand this well. (I do think a lot of non-tech companies that nevertheless have software in their business get this wrong, and that's why we see a lot of terrible software out there. E.g. most car companies.)

    As for junior engineers, despite the comparisons in coding skill level, I don't think most people are suggesting that AI should replace junior engineers. There are a lot of things humans do which AI still can't, such as seeing the bigger picture that the code is meant to implement, and also, as you note, learning over time. An LLM's consciousness ends with its context window.

  • Apparently I missed the end of a sentence in the original version of my comment above: "but because" in the fourth paragraph was supposed to continue "but because the sales pitch is that the machine can replace me". Oops.