Comment by Barbing

14 hours ago

>his whole job

Possibly akin to a roofer taking a shortcut up there, then taking a spill? You knew better, but unfortunately let the odds of getting away with it, with zero impact, decide for you.

IIRC the hallucinations were essentially kicked off by user error, or rather… let’s say at least: a journalist using the best available tooling should have been able to reduce the chance of an issue this big to near zero, even with language models in the loop & without human review.

(e.g. imagine Karpathy’s llm-council with extra harnessing/scripting, so even MORE expensive, but still. Or some RegEx!)
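The “some RegEx” point can be sketched as a verbatim-quote check against the source material. This is just a minimal illustration of the idea, not anything from llm-council; the `quote_appears` helper and its normalization rules are my own assumptions:

```python
import re

def quote_appears(quote: str, transcript: str) -> bool:
    """Check whether a quote appears verbatim in the source transcript,
    ignoring case, quote-mark style, and whitespace differences.
    (Hypothetical helper for illustration only.)"""
    def normalize(text: str) -> str:
        text = text.lower()
        text = re.sub(r"[\u2018\u2019]", "'", text)  # curly -> straight apostrophes
        text = re.sub(r"[\u201c\u201d]", '"', text)  # curly -> straight double quotes
        text = re.sub(r"\s+", " ", text)             # collapse runs of whitespace
        return text.strip()
    return normalize(quote) in normalize(transcript)

# Hypothetical transcript a reporter might be quoting from:
transcript = "We never said the bridge was unsafe. Inspections happen every year."
print(quote_appears("the bridge was unsafe", transcript))          # True
print(quote_appears("we knew the bridge was unsafe", transcript))  # False
```

A check like this obviously can’t catch a quote pulled out of context, but it would flag a quote that simply doesn’t exist in the source, which is the failure mode at issue here.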

Alternatively… there was no AI error: the reporter made up the quotes, and lied when challenged.

  • The chance that AI screwed up and got caught the very first time it was used is pretty low.

    It’s likely been used before, but nobody got caught.