LLMs are to interns what house cats are to babies. They seem more self-sufficient at first, but soon the toddler grows up and you're stuck with an animal that will forever need you to scoop its poop.
Without a mechanism to detect output from LLMs, we’re essentially facing an eternal model collapse with each new ingestion of information, from academic journals to blogs to art. [1][2]
And the content online is now written by a Fully Automated Eternal September.
Today is Friday the 11490th of September 1993.
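For anyone who wants to check that arithmetic, here is a quick Python sketch, assuming the usual Eternal September convention that the day number counts days elapsed since August 31, 1993 (so September 1, 1993 is day 1):

    from datetime import date

    def eternal_september(today: date) -> str:
        # Day number = days elapsed since August 31, 1993,
        # so September 1, 1993 is day 1 of the eternal September.
        # (Ordinal suffix is hard-coded as "th" for brevity.)
        day_number = (today - date(1993, 8, 31)).days
        return f"{today:%A} the {day_number}th of September 1993"

    # The 11490th of September 1993 falls on February 14, 2025.
    print(eternal_september(date(2025, 2, 14)))
    # -> Friday the 11490th of September 1993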
[1] https://en.m.wikipedia.org/wiki/Model_collapse
[2] https://thebullshitmachines.com/lesson-16-the-first-step-fal...