Comment by baxuz

6 months ago

The thing is that the data from actual research doesn't support your anecdotal evidence of quality:

- https://metr.org/blog/2025-07-10-early-2025-ai-experienced-o...

- https://www.theregister.com/2025/06/29/ai_agents_fail_a_lot/

But more importantly, it makes you stupid:

- https://www.404media.co/microsoft-study-finds-ai-makes-human...

- https://archive.is/M3lCG

And it's an unsustainable bubble and wishful thinking, much like crypto:

- https://dmitriid.com/everything-around-llms-is-still-magical...

So while it may be a fun toy for senior devs who know what to look for, it actually makes them slower and stupider, making them progressively less able to do their job and apply critical thinking.

And as for juniors — they should steer clear of AI tools: they can't assess the quality of the output, they learn nothing, and their critical thinking skills are impaired as well.

So with that in mind — Who is the product (LLM coding tools) actually for, and what is its purpose?

I'm not even going into the moral, ethical, legal, social and ecological implications of offloading your critical thinking skills to a mega-corporation, which can only end up like https://youtu.be/LXzJR7K0wK0

All of those studies have been torn apart in detail, often right here on HN.

> So while it may be a fun toy for senior devs that know what to look for, it actually makes them slower and stupider, making them progressively less capable to do their job and apply critical thinking skills.

I've been able to tackle problems that I literally would not have been able to undertake w/o LLMs. LLMs are great at wading through SO posts and GH issue threads and figuring out what magic set of incantations makes some stupid library actually function. They are really good at writing mock classes way faster than I ever have been able to. There is a cost/benefit analysis for undertaking new projects, and if a "minor win" involves days of wading through garbage, odds are the work isn't going to happen. But with LLMs I can outsource the drudgery part of the job (throwing crap tons of different parameters at a poorly documented function and seeing what happens) and actually do the part that is valuable (designing software).

You still have to guide the design! Anyone letting LLMs design software is going to fail hard; LLMs still write some wacky stuff. And they are going to destroy juniors. I don't know what the future of the field is going to be like (not pretty, that is for sure...)

But I just had an LLM write me a script in ~2 minutes (me describing the problem) that would've taken me 30-60 minutes to write and debug. There would have been no "learning" going on writing a DOS batch script (something I have to do once every 2 or 3 years, so I forget everything I know each time).

  • The AI in OSS study was not “torn apart”.

    The AI aficionados made scary faces at it, tried to scratch it with their cute little claws and then gave up and stopped talking about it. :)

  • > All of those studies have been torn apart in detail, often right here on HN.

    You mean the same Hacker News where everyone was suddenly an expert in epidemiology a few years ago and now can speak with authority to geopolitics?

    • Except we are experts on programming, and on the development and deployment of new technologies.

      "Large group of experts software engineers have informes opinions on software engineering" isn't exactly a controversial headline.


These studies profoundly miss the mark and were clearly written for engagement/to push a certain view. It's abundantly clear to any developer who has used LLMs that they are a useful tool and have turned the corner in terms of the value they're able to provide vs their limitations.

  • Not to me. I have also not seen any signs that this technology has had macroeconomic effects, and I don't know of any developers in meatspace that are impressed.

    To me it seems like a bunch of religious freaks and psychopaths rolled out a weird cult, in part to plaster over layoffs for tax reasons.

    • > I don't know of any developers in meatspace that are impressed

      I have a theory that there is some anomaly around the Bay Area that makes LLMs much better there. Unfortunately the effects don't seem to be observable from the outside; it doesn't seem to work on anything open source.

- higher editorial standards and gatekeeping meant print media was generally of higher quality than internet publications

- print publications built reputations over spans of time that the internet still hasn't existed for, earning greater trust and authority, and helping to establish shared cultural touchstones and social cohesion

- copyright was clearer and more meaningful, piracy was more difficult

- selling physical copies and subscriptions was a more stable revenue source for creators and publishers than the tumult of selling ads in the 21st century

And all of this was nothing in the face of "receiving pages of text. Faster than one could read"

> Who is the product (LLM coding tools) actually for, and what is its purpose?

Ideally: it's for people who aren't devs, don't want to be devs, can't afford to pay devs to build their hobby projects for them, and just want to have small tools to unblock or do cool stuff. It's pretty incredible what a no-coder can knock off in an evening just by yelling at Cursor. It's a 3D printer for code.

But realistically, we know that the actual answer is: the people who already destroy companies for their own short-term benefit and regard all tech workers as fungible resources will have no problem undermining the feasibility of hiring good senior devs in 2050 in exchange for saving a ton of money now. They'll pay non-devs non-dev money to replace juniors, lean HARD on the remaining mids/seniors to clean up the resulting mess, and then pull the ripcord on their golden parachute and fuck off to some yacht or island or their next C-suite grift before the negative consequences hit, all the while touting all the money they saved "automating" the development process at their last corp. And then private equity buys it up, "makes it efficient" to death, and feeds its remaining viable organs to another company in their portfolio.

I think it's worth saying that I basically completely disagree with your assessment (how you read the evidence, your conclusions, and quite possibly your worldview), and think that if you were to give me access to infinite-throughput claude code in 2018, I could have literally ruled the world.

I'm not the most impressive person on hacker news by a wide margin, but I've built some cool things that were hard, and I think these tools are absolutely inevitable; I frequently have the exact same "one shot" type experience where things just work. I would seriously reconsider whether this is something you can't make work well for you, or something you don't want to work well.

> So with that in mind — Who is the product (LLM coding tools) actually for, and what is its purpose?

It's for grifters to make more money by going viral on Twitter, and for non-technical managers who want to get rid of their workforce.

"But more importantly, it makes you stupid:"

I don't think it was your intent, but that reads as a seriously uncalled-for attack - you might want to work on your phrasing. Hacker News rules are pretty clear on civility being an important virtue.

  • I doubt it. It's not directed at an individual, and it's presented as a passive fact. It's a bit like saying "drugs make you stupid", which no-one would complain about.

  • I didn't target the author, and I used the terminology from the article's heading.