Comment by quesera
8 days ago
Right -- we know that LLMs cannot think, feel, or understand.
Therefore, whenever they produce output that looks like the result of those things, either we are being deceived by a reasonable facsimile, or we misapprehended their necessity in the first place.
But, do we understand the human brain as well as we understand LLMs?
Obviously there's something different, but is it just a matter of degree? LLMs have greater memory than humans, and a lesser ability to correlate it. Correlation is powerful magic. It's pattern matching, though, and I don't see a fundamental reason why LLMs won't get better at it. Maybe never as good as (smart) humans are, but with their superior memory, maybe that will often be adequate.
> they produce output that looks like the result of those things
Is a cardboard cutout human to some degree? Is a recording a voice? What about a voice recording in a phone menu?
> LLMs have greater memory than humans,
So does a bank of hard drives by that metric.
(Memory Access + Correlation Skills) is a decent proxy for several of the many kinds of human intelligence.
HDDs don't have correlation skills, but LLMs do. They're just not smart-human-level "good", yet.
I am not sure whether I believe AGI will happen. To be meaningful, it would have to be above the level of a smart human.
Building an army of disembodied, average-human-intelligence actors would be economically "productive", though. This is the future I see us trending toward today.
Most humans are not special. This is dystopian, of course. Not in the "machines raise humans for energy" sort of way, but probably no less socially destructive.
> HDDs don't have correlation skills, but LLMs do
So which is it, the memory or the correlation? I'll give you a hint: this is a trick question.