
Comment by apwell23

6 days ago

> tell it when it hallucinates, it’ll correct itself

no it doesn't. Are you serious?

It corrected itself just today, 3 times, and countless times before… you just gotta take some serious time to learn and understand it… or, alternatively, write snarky comments on the internet…

  • So when LLMs go around in circles, as they often do [1], that's a skill issue. But when they get it right some of the time, that's proof of superiority.

    This is the kind of reasoning that dominates LLM zealotry: no evidence given for extraordinary claims, just a barrage of dismissals of legitimate problems, including the article under discussion.

    All of this makes me have a hard time taking any of it seriously.

    [1]: https://news.ycombinator.com/item?id=44050152

  • Interesting. For me it just keeps making up new stuff that doesn't exist when I feed it the error and tell it that it's hallucinating.

    Perhaps people building CRUD web apps have a different experience than people building something niche?