Comment by beardedwizard
2 years ago
That is something I hear over and over, particularly as a rebuttal to the argument that an LLM is just a stochastic parrot. Calling it "good enough" doesn't mean anything; it just allows the person saying it to disengage from the substance of the debate. It either reasons or it doesn't, and today it categorically does not.
The point that you do not need consciousness to achieve reasoning does not become any less true just because a subset of people see in LLMs something that appears to them to be reasoning.
I do not really understand who you are accusing of a «good enough» stance: we never defined "consciousness" as a goal (cats already have it, and we do not seem to need more of that), we just want something that reasons. (And that reasons exceedingly well.)
The apparent fact that LLMs do not reason is simply irrelevant to an implementation of AGI.
The original poster wrote that understanding consciousness would be required to «crack AGI», and no, we contend: we want AGI as a superhuman reasoner, and consciousness seems irrelevant to that.
If you can't define consciousness, how can you define "good enough"? "Good enough" is just a lazy way to exit the conversation.
LLMs appear to reason because they captured language that already had reasoning embedded in it, rather than producing language through reasoning.
You build this system, an LLM; it uses a technical process to produce seemingly the same output that you, a human, could produce, given a prompt.
You can do "reasoning" about a topic, the LLM can produce a very similar output to what you could, how do you name the output of the LLM?
Birds can fly; they do so by a natural process. A plane can also fly, and we do not "see" any difference between the two when we look at them flying in the air, far from the ground.
That is mostly what the question of LLMs "doing reasoning" or not comes down to. Semantics. The output is the same.
You could just call it something else, but it would still be the same output.
Philosophers have defined consciousness, so why do people keep repeating that line? It is your subjective sensations that make up perception, dreams, inner dialog, that sort of thing. Call it qualia, representations, or correlations, but we all experience colors, sounds, tastes, pains, pleasures. We all probably dream, and most of us visualize or have inner dialog. It's not that hard to define; the confusion only comes from the ambiguity of the word, which gets conflated with other mental activity like being awake or being aware.
Nobody here is speaking of «good enough»: you are the only one who brought it up.
And nobody here is saying that LLMs reason: you are attacking an idea that was not proposed here.
The comment you replied to said that calculators do not need consciousness to perform calculations. Reasoning is a special form of calculation. We are content with reasoning, and once it is implemented there will be no need for anything further for the applications we intend.