Comment by buz11
6 months ago
The most useful analogy I've heard is that LLMs are to the internet what lossy JPEGs are to images. The more you drill in, the more compression artifacts you get.
(This is of course also the case for the human brain.)