Comment by peterlk

16 hours ago

I think everyone should avoid talking about consciousness unless someone in the conversation provides a clear definition of it. If no one provides a definition, we can replace the word “consciousness” with the word “spirit”, and basically nothing about the conversation would change. Without a definition, every conversation about AI consciousness devolves into one camp saying that humans are special and consciousness is unique to them, and another camp that waves their hands about consciousness “duck typing”.

For example, we could define consciousness as the ability to communicate claimed internal states. Perhaps there could be a complexity measure that gives us a metric of consciousness.

We could define consciousness as the ability to respond to stimuli in complex ways. This would make a supermarket’s automatic doors slightly conscious.

Personally, I don’t really care how it is defined in any particular conversation, so long as it is defined. Otherwise we’re just flailing at each other in the dark.

>we could define consciousness

We cannot. And our definitions mean nothing to reality. We can all define something as something else; that means nothing to how it behaves. But ultimately, as I said in a previous comment, we have no choice but to agree or not. It cannot be tested in any way that makes it absolutely certain, because it's a logical issue. We cannot even be certain that anyone but ourselves is conscious. We all just sort of agree that everyone else must be.

The issue with defining it is that someone could potentially find a way to make a machine that mimics it but works nothing like a consciousness-generating brain does. So, if it meets our definition criteria, is that conscious? Where's the certainty? How do we prove it is?

Anything we could ever dare call conscious must work exactly like a human brain does. Any deviation from that loses certainty on it having consciousness or not.

And let's not ignore the huge incentive corporations would have in meeting your definition with something that has nothing to do with consciousness, just so they can profit off it.

  • >The issue with defining it is that someone could potentially find a way to make a machine that mimics it but works nothing like a consciousness-generating brain does. So, if it meets our definition criteria, is that conscious? Where's the certainty? How do we prove it is?

    This sentence of yours makes me think you've missed the point of the post you're replying to.

    Unless you're actually agreeing with them, but I can't tell.

If you're willing to reduce metaphysical questions to definitions (which I'm basically on board with), then the stakes aren't that high in the first place, so we should carry on using "consciousness" in its everyday sense because there's no precious reason to avoid it.

  • >so we should carry on using "consciousness" in its everyday sense because there's no precious reason to avoid it.

    What is the everyday sense of the word?

Consciousness is not definable because we don't know enough about it. That doesn't mean it can't be discussed; we didn't have a good definition of "number" until the 1800s. That didn't make arithmetic meaningless, because people had an understanding of the concept. The lack of formal definition pointed to a gap in logic that took thousands of years to fill. Likewise, there is a gap in experimental neuroscience that will take many decades to fill.

FWIW, as someone in the "first camp", my real claim is that many animals are meaningfully conscious, including all birds and mammals, and no claims of LLM consciousness even bother to reconcile with this. It is extremely frustrating that there are essentially two ideas of consciousness floating around:

- the scientifically interesting one: a vague collection of cognitive abilities and behaviors found in all vertebrates, especially refined in birds and mammals

- the sociologically interesting one: saying "cogito ergo sum" in a self-important tone

Claude has the second type in spades, no doubt. The first is totally absent. And I have a good dismissal of the second type of consciousness: it appears to be totally absent in all conscious animals except humans. So it is irrational and unscientific to take this behavior as a sign of consciousness in Claude, when Claude is missing all the other signs of consciousness that humans actually do have in common with other animals.

Sometimes I seriously wonder if people at Anthropic consider dogs to be conscious. Or even Neanderthals.