Comment by rudolph9

4 months ago

I can’t help but think a lot of these comments are actually written by AI — and that, in itself, showcases the value of AI. The fact that all of these comments could realistically have been written by AI with what’s available today is mind-blowing.

I use AI on a day-to-day basis, and by my best estimates, I’m doing the work of three to four people as a result of AI — not because I necessarily write code faster, but because I cover more breadth (front end, back end, DevOps, security) and make better engineering decisions with a smaller team. I think the true value of AI, at least in the immediate future, lies in helping us solve common problems faster. Though it’s not yet independently doing much, the most relevant expression I can think of is: “Those who cannot do, teach.” And AI is definitely good at relaying existing knowledge.

What exactly is the utility of AI writing comments that seem indistinguishable from people? What is the economic value of a comment or an article?

At the present rate, there is a good argument to be made that the economic value is teetering toward negative.

A comment on a post or an article on the internet has value ONLY if there are real people at the other end of the screen reading it and being influenced by it.

But if you flood the internet with AI slop comments and articles, can you be 100% sure that all the current users of your app will stick around?

If there are no people to read your articles, they have zero economic value.

  • Perhaps economic value can come from a more educated and skilled workforce if they're using AI for private tuition (if it can write as well as we do, it can provide a bespoke syllabus, feedback, etc.).

    Automation over teaching sounds terrible in the long run, but I could see how learning languages and skills could improve productivity. The "issue" here might be that there's more to gain in developing nations with poor education standards, so while capital concentrates more in the US because they own the tech, geographical differences in labour productivity shrink.

  • What is the economic value of a wheel? If we flood the market with wheels, we’re going to need far fewer sleds and horses. Pretty soon, no one might need horses at all — can you imagine that?

    • No, you flood the roads with so many constantly running robot wheels that no one actually wants to walk or drive on the road anymore, because the robot wheels keep bumping into them.

      In the process of making better wheels, you’ve made the roads unusable, and now no one wants to leave the house - or buy wheels.


    • This is the most nothing-burger response I've ever seen in my life.

      Comments written by fucking humans barely have any value. All their value comes from the fact you can manipulate humans into buying shit - advertising.

      You can't manipulate AI into buying shit because it doesn't have money because it's not a laborer and doesn't have a right to a fair wage.

  • Hell, it has negative economic value because of the opportunity cost of the electricity and water used to produce it.

That first sentence is a tautology. The second-to-last sentence is one of those things it’s okay to think until you learn better, but that you don’t say in polite company.

Did AI write all these comments? Is AI turning me into a conspiracy theorist? Lately I keep seeing "AI is like having a team of 3-4 people" or "doing the work of 3-4 people" type posts everywhere, like it's some kind of meme. I don't even know what it means. I don't think you're saying you have 4x'd your productivity? But maybe you are?

  • Best I can tell, it’s resulting in less churn, which isn’t the same as work getting done faster. Maybe it’s a phenomenon unique to engineering, but what I’m observing is that a smaller number of people can manage a much larger footprint because AI tools have gotten really good at relaying existing knowledge.

    Little things that historically would get me stuck as I switch between database work, front-end, and infrastructure are no longer impeding me, because the AI tools are so good at conveying the existing knowledge of each discipline. So now, with a flat org, things just get done — there’s no need for sprint masters, knowledge-sharing sessions, or waiting on PR reviews. More people means more coordination, which ultimately takes time. In some situations that’s unavoidable, but in software engineering, most of the patterns, tools, and practices are well established; it’s just a matter of using them effectively without making your head explode.

    I think this relay of knowledge is especially evident when I can’t tell an AI comment from a human one in a technical discussion — a kind of modern Turing Test, or Imitation Game.

    • I'm not saying anything that hasn't been said a thousand times before. But I find the shortcomings are evident when I'm getting it to do something I consider myself good at, and that's what's worrying to me. I work in DevOps, and there are a couple of tools I'm really good at. If I just trusted the output, all my configuration would be outdated and set up like a blog example, with all the issues and shortcuts one takes in those (and I see that in the PRs I get from team members who rely on Claude heavily). But if you didn't know the tool, it would look fine. So when I code with the agent, it all looks really good, but I must be missing things, right? For scripts that have no impact if they fail, I LLM the shit out of that.