Comment by CrackerNews

2 days ago

The case against LLMs thinking could be that "backpropagation is a leaky abstraction." Whether an LLM is thinking depends on how well the mathematical model is defined. Ultimately, there appears to be a limit to the mathematical model that caps an LLM's capacity to think. It is "thinking" at some level, but is that level significant enough for LLMs to be integrated into human society the way the hype suggests?

Andrej Karpathy, in his interview with Dwarkesh Patel, was blunt about the current limitations of LLMs and said that further architectural developments would be needed. LLMs lack the capacity to dream and to distill learned experience and knowledge back into their weights. Thinking in LLMs exists at best as a "ghost," present only in the moment and only as long as the temporary context remains coherent.