Comment by LordDragonfang

5 days ago

> It is merely that an LLM is not it, and will never become it.

Okay, I didn't want to put words in your mouth by claiming you said this, but now that you have, I can address it.

You have literally no way of knowing this. You don't understand how cognition actually works, because no one does, and you don't understand how LLMs actually produce a facsimile of intelligence, for very similar reasons. So you can't say that with certainty, and you likewise can't claim to know what is actually required for cognition (without leaning heavily on human exceptionalism).

Skeptics of LLMs have been claiming that they "cannot possibly X" for the better part of a decade, and time and time again they have been proved wrong. As it happens, I was just reading an article this morning[1] that makes the same point:

> [W]e’re still having the same debate - whether AI is a “stochastic parrot” that will never be able to go beyond “mere pattern-matching” into the realm of “real understanding”.

> My position has always been that there’s no fundamental difference: you just move from matching shallow patterns to deeper patterns, and when the patterns are as deep as the ones humans can match, we call that “real understanding”. This isn’t quite right - there’s a certain form of mental agency that humans still do much better than AIs - but again, it’s a (large) difference in degree rather than in kind.

> I think this thesis has done well so far. So far, every time people have claimed there’s something an AI can never do without “real understanding”, the AI has accomplished it with better pattern-matching.

While I can't claim to have been quite as prescient as the author, I agree with his position.

It wasn't so long ago that our standard conception of AI was that it could never make anything that could be called "art"[2], and now we have models that churn out images in seconds and poetry that average people rate as better than what most humans write[3].

You have a whole, well-spoken argument, but despite your claiming otherwise, every point boils down to "in order to have a trait associated with human thought, it needs to be more like a human brain". Your reasoning is circular, and it all comes down to human exceptionalism.

> But they provide no means for cognition. And creativity requires cognition. Creativity is a conscious process, for it requires imagination, which is an offshoot of a conscious process.

"Consciousness", the philosophical term primarily infamous for the fact that no one understands or agrees how it works - only that humans have it and some other animals may or may not. But somehow you know that "imagination" (a term not otherwise defined or justified) requires it, and that cognition (again undefined except to assert that LLMs don't have it, despite being able to take in information and retain it) requires that, and therefore, LLMs can't be creative unless they are more human-like.

> At that point, we should inject agency. A reason to exist. With current life, the sole primary reason is "reproduce". Everything else has derived from that premise. I spoke of the mating urge, we should recreate this here.

Again, algorithms are infamous for following goals and chasing rewards even more intently than humans do, but you only count that as a "purpose" if it's the same one humans have.

And so on.

[1] https://www.astralcodexten.com/p/now-i-really-won-that-ai-be...

[2] https://knowyourmeme.com/memes/can-a-robot-write-a-symphony

[3] https://www.nature.com/articles/s41598-024-76900-1