Comment by idiotsecant
3 months ago
'think' is one of those words that used to mean something but is now hopelessly vague; in discussions like these it becomes a blunt instrument. IMO LLMs don't 'think' at all - they predict what their model says is most likely next, based on previously observed patterns. There is no world model or novelty. They are exceptionally useful idea adjacency lookup tools. They compress and organize data in a way that makes it shockingly easy to access, but they only 'think' in the way the Dewey decimal system thinks.
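(For what it's worth, the "idea adjacency lookup" framing can be sketched as a toy bigram model: count which word follows which in some text, then always emit the most frequently observed continuation. The corpus and names here are made up for illustration; real LLMs are vastly more sophisticated, which is exactly what's in dispute below.)

```python
from collections import Counter, defaultdict

# Toy "lookup" next-word predictor: count bigrams in a tiny corpus,
# then always predict the most frequently observed continuation.
corpus = "the cat sat on the mat the cat ate the fish".split()

bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict(word):
    # Return the most common word seen after `word`, or None if unseen.
    seen = bigrams.get(word)
    return seen.most_common(1)[0][0] if seen else None

print(predict("the"))   # "cat" follows "the" most often in this corpus
print(predict("fish"))  # None: "fish" never precedes anything here
```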
If we were having this conversation in 2023 I would agree with you, but LLMs have advanced so much since then that "they are essentially efficient lookup tables" is an oversimplification so dramatic it tells me you don't understand what you're talking about.
No one accuses the Dewey decimal system of thinking.
If I am so ignorant maybe you'd like to expand on exactly why I'm wrong. It should be easy since the oversimplification is dramatic enough that it made you this aggressive.
No, I don't want to waste my time trying to change the view of someone so closed-minded they can't accept that LLMs do anything close to "thinking".
Sorry.