Comment by LordDragonfang

15 hours ago

This type of response is just stochastic parrotry, rather than displaying evidence of actual <whatever cognitive trait we're overconfidently insisting LLMs don't have>.

Yet more evidence that LLMs are more similar to humans than we give them credit for.

It never stops fascinating me that people argue this kind of thing. Why invent an explanation where this obvious mistake is actually some elaborate 4D-chess sarcastic "intention"? It's a simple machine; its network just failed at making up a new Toy Story character. That's it! Simple as that! Occam's Razor, anybody?

Or sure, maybe the regex I wrote the other day, the one with a bug that missed replacing certain parts, also had an "intention". It just wanted to demonstrate how fallible I am as a human, so it played this elaborate prank on me. /s