Comment by bwfan123

6 days ago

Agree - it reads like clickbait, or worse, like a sponsored piece.

> But “hallucination” is the first thing developers bring up when someone suggests using LLMs, despite it being (more or less) a solved problem.

Really? What is the author smoking to consider it a solved problem? This statement alone invalidates the entire article with its casual irreverence for the truth.

I use Copilot every day, and I know where it shines. Please don't try to sell it to me with false advertising.

The article specifically says it's not talking about Copilot, but about agents that verify the code compiles before showing it to you.

If the code uses a function, you can be sure that function is real.

Was this not clear? The explanation I'm paraphrasing sits right between the line Aurornis quoted and the line you quoted, except for the crack at Copilot up at the top.