Comment by thaumasiotes
6 days ago
> LLMs are really good with words and kind of crap at “thinking.” Humans are wired to see these two things as tightly connected. A machine that thinks poorly and talks great is inherently confusing. A lot of the discussion and disputes around LLMs come down to this.
It's not an unfamiliar phenomenon in humans. Look at Malcolm Gladwell.