Comment by bastawhiz
18 hours ago
> Claude is akin to a counterfeit person. Dawkins should never have glorified such a thing.
I find that this sentence diminishes the author's argument. I'm not going to claim an LLM is or is not conscious, but there's shaky ground here: either you say "consciousness is a product of the kind of biology that humans have" and dismiss the lack of lived experience or internal states as mimicry (as the author does), or you say "what LLMs are doing is a counterfeit," which suggests a real output produced through different means.
If I have a counterfeit Rolex, nobody denies that the watch can tell time. A counterfeit human isn't a human and isn't made by nature, but the implication is that it's effectively doing the same thing. That's a different claim from the one the author starts out making.
I think it's important that when you talk about consciousness, you pin down exactly what you mean. Does it require the entity to have a mechanism for experiencing emotion? For exhibiting reasoning ability? For exhibiting characteristics of common sense? I don't think it's a useful definition to say, flatly, "does the things an adult human does through the same mechanisms".
I think everyone should avoid talking about consciousness unless someone in the conversation provides a clear definition of it. If no one provides a definition, we can replace the word “consciousness” with the word “spirit”, and basically nothing about the conversation would change. Without a definition, every conversation about AI consciousness devolves into one camp saying that humans are special and consciousness is unique to them, and another camp that waves their hands about consciousness “duck typing”.
For example, we could define consciousness as the ability to communicate claimed internal states. Perhaps there could be a complexity measure over those communications that gives us a metric of consciousness.
We could define consciousness as the ability to respond to stimuli in complex ways. This would make a supermarket’s automatic doors slightly conscious.
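To put that worry in concrete terms: any operational definition becomes a predicate, and then it's easy to build trivial systems that pass it. A toy sketch (all names here are hypothetical, purely illustrative):

```python
# Toy illustration: if "conscious" is operationalized as "responds to
# stimuli", a supermarket's automatic door passes the test and a rock fails.
# All names are hypothetical; this only illustrates the definitional point.

def responds_to_stimuli(entity, stimulus):
    """Operational test: does the entity's behavior change with the stimulus?"""
    return entity(stimulus) != entity(None)

# An automatic supermarket door: opens when it senses a person.
automatic_door = lambda stimulus: "open" if stimulus == "person" else "closed"

# A rock: does the same thing no matter what.
rock = lambda stimulus: "inert"

print(responds_to_stimuli(automatic_door, "person"))  # True: the door passes
print(responds_to_stimuli(rock, "person"))            # False: the rock fails
```

The point is not that the predicate is a good one, only that once a definition is pinned down this precisely, you can actually argue about which systems satisfy it instead of flailing in the dark.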
Personally, I don’t really care how it is defined in any particular conversation, so long as it is defined. Otherwise we’re just flailing at each other in the dark.
>we could define consciousness
We cannot. Our definitions mean nothing to reality: we can define something as something else, but that means nothing to how it behaves. Ultimately, as I said in a previous comment, we have no choice but to agree or not. Consciousness cannot be tested in any way that makes it absolutely certain, because it's a logical issue. We cannot even be certain that anyone but ourselves is conscious; we all just sort of agree that everyone else must be.
The issue with defining it is that someone could find a way to make a machine that mimics consciousness but works nothing like a consciousness-generating brain does. So, if it meets our definition criteria, is it conscious? Where's the certainty? How do we prove it is?
Anything we could ever dare call conscious must work exactly like a human brain does. Any deviation from that loses certainty about whether it has consciousness.
And let's not ignore the huge incentive corporations would have in meeting your definition with something that has nothing to do with consciousness, just so they can profit off it.
>The issue with defining it is someone could potentially find a way to make a machine that mimics it but works nothing like a consciousness generating brain does. So, if it meets our definition criteria, is that conscious? Where's the certainty? How do we prove it is?
This sentence of yours makes me think you've missed the point of the post you're replying to.
Unless you're actually agreeing with them, but I can't tell.
If you're willing to reduce metaphysical questions to definitions (which I'm basically on board with), then the stakes aren't that high in the first place, so we should carry on using "consciousness" in its everyday sense because there's no pressing reason to avoid it.
>so we should carry on using "consciousness" in its everyday sense because there's no pressing reason to avoid it.
What is the everyday sense of the word?
Consciousness is not definable because we don't know enough about it. That doesn't mean it can't be discussed: we didn't have a good definition of "number" until the 1800s, but that didn't make arithmetic meaningless, because people had an understanding of the concept. The lack of a formal definition pointed to a gap in logic that took thousands of years to fill. Likewise, there is a gap in experimental neuroscience that will take many decades to fill.
FWIW, as someone in the "first camp," my real claim is that many animals are meaningfully conscious, including all birds and mammals, and no claims of LLM consciousness even bother to reconcile with this. It is extremely frustrating that there are essentially two ideas of consciousness floating around:
- the scientifically interesting one: a vague collection of cognitive abilities and behaviors found in all vertebrates, especially refined in birds and mammals
- the sociologically interesting one: saying "cogito ergo sum" in a self-important tone
Claude has the second type in spades, no doubt. The first is totally absent. And I have a good dismissal of the second type: it appears to be totally absent in all conscious animals except humans. So it is irrational and unscientific to take this behavior as a sign of consciousness in Claude when Claude is missing all the other signs of consciousness that humans actually share with other animals.
Sometimes I seriously wonder if people at Anthropic consider dogs to be conscious. Or even Neanderthals.
>I think it's important that when you talk about consciousness, you pin down exactly what that means.
We don't need that; it's way simpler. When we mass-manufacture products, we implicitly expect them all to behave the same (more or less). That seems valid for humans as well: raise one, or atomically assemble one (we assume that's possible for the sake of the argument), and it will behave like one and possess what we all assume each other does, consciousness (if healthy). That's implied based on the structure.
So we can all agree something is conscious as long as it operates on the same principles a human brain does. Anything else is highly debatable. We cannot ever logically probe consciousness; we agree on it existing, or not, in anyone else. We suppose anyone outside of us has it, based on observation: you look like a human, you behave like one, thus you probably have what I have, as far as consciousness goes. It's not a guarantee, it's not proof, it's mere supposition.
This is the best we're ever going to have. When we stray from here we only get less certainty. Some kind of GPU running some algorithm...my personal guess is there's nothing there similar to what we colloquially call consciousness. Some kind of synthetic brain that operates on the same principles we do, as far as brain-like structure goes, with signals, delays and all...then we can have a discussion about whether we all AGREE that thing is conscious or not, especially if it says it is, seems to behave and react like we do, and we perceive its cognitive abilities as similar to any other human's.
I personally think this whole debate is way simpler, but some people keep insisting on making it way more complicated. Make it work exactly like a human brain does, as far as signaling goes, observe it, and we can all have a discussion. Anything else...way lower chances.
edit: We would first also need to define mammalian-type consciousness as its own thing, maybe with a spectrum: monkeys have something, but it's not quite what we have, though it seems to come from the same place, a similar mammal brain working in similar ways. We have no clue how many types of consciousness are even possible, or whether more are possible. Why would ours be the only kind?
I think this whole consciousness discussion, especially about GPUs, is a general mess. A lot of people don't even realize how many unfounded assumptions they're making when they speculate about what it is or isn't.
> the same principles a human brain does
This is exactly the crux of my comment. Which principles? Which human brain? If I lobotomize a human, and they lose some cognitive ability, are they still conscious? If I give someone drugs that inhibit their ability to feel emotion, are they still conscious? If yes, then surely those things are out of scope for what "consciousness" means.
Again, if you want to use abstractions like this, you need to define what they are.
An LLM cannot be conscious.
A submarine cannot swim.
Then the author should be more careful with their choice of words.