Comment by mdp2021
2 years ago
> understand consciousness
We do not call Intelligence something related to consciousness. Being able to reason well suffices.
2 years ago
> We do not call Intelligence something related to consciousness. Being able to reason well suffices.
That is something I hear over and over, particularly as a rebuttal to the argument that an LLM is just a stochastic parrot. Calling it "good enough" doesn't mean anything; it just allows the person saying it to disengage from the substance of the debate. It either reasons or it doesn't, and today it categorically does not.
The remark that you do not need consciousness to achieve reasoning does not become less true just because a subset of people see in LLMs something that appears to them to be reasoning.
I do not really understand whom you are accusing of a «good enough» stance: we never defined "consciousness" as a goal (cats already have it, and we do not seem to need more); we just want something that reasons. (And that reasons excellently well.)
The apparent fact that LLMs do not reason is drily irrelevant to an implementation of AGI.
The original poster wrote that understanding consciousness would be required to «crack AGI», and no, we state: we want AGI as a superhuman reasoner, and consciousness seems irrelevant to that.
If you can't define consciousness, how can you define "good enough"? "Good enough" is just a lazy way to exit the conversation.
LLMs appear to reason because they captured language that already has reasoning embedded in it, rather than producing language as a result of reasoning.