Comment by tracerbulletx
13 hours ago
We don't even know what the prerequisites for consciousness are, so we have no way of knowing. LLMs have emergent behavior that is reminiscent of language-forming brains, but they're also missing a lot of properties that are probably necessary: mainly continuity over time, more integrated memory, and a better sense of space and time. Brains use the rhythm and timing of neuronal firings, and the length of axons affects computation; they do a lot of different things with signals and patterns. But in any case, without knowing what consciousness is, I don't know which of those things are required.
> LLMs have emergent behavior that is reminiscent of language forming brains,
Indeed, but then we need to prove that they are not "Chinese room" conscious. Which is hard, because it might be that the thing running the Chinese room is conscious, but can only communicate in a way it doesn't understand.
> We don't even know what the pre-requisites for consciousness are so we have no way of knowing.
Imo we don't even have a definition of the word that we agree on.
The ability to feel pain or pleasure is a good indicator, I think.
That would be the physically embodied definition. Which is a useful starting point, because clearly our consciousness is physically embodied, while an LLM's isn't.
This matters more than it seems, because we're not calculators, and we're not just brains. There are proven links between mental and emotional states and - for example - the gut biome.
https://www.nature.com/articles/s41598-020-77673-z
There's a huge amount going on before we even get to the language parts.
As for Dawkins: as someone on Twitter pointed out, the man who devoted his life to telling believers in sky fairies that they were idiots has now persuaded himself there's a genie living inside a data centre, because it tells him he's smart.
If he'd actually understood critical thinking instead of writing popular books about it, he wouldn't be doing this.
What about single-celled or microscopic multicellular life forms? They can sense positive and negative aspects of their surroundings and move toward or away from them. I don't think most people would count them as conscious despite this directed behavior.
There are times I am feeling neither pain nor pleasure, but I am still experiencing consciousness.
So that definition seems to fail immediately.
And how do you even measure pain? Is it painful for an LLM to be reprimanded after generating a reply the user doesn't like? It seems to act like it is.
And how do you define pain and pleasure? Do insects feel pain?
Now you have to define pleasure AND pain without using the word "consciousness", as that would be circular logic.
Is pleasure then any reward function? Then a mathematical set of equations performed by a human writing on a piece of paper could qualify. Does that mean pen and paper are conscious? Or certain equations?
We're pretty clear on the distinction between a conscious and an unconscious human.
We might not clearly understand the difference between the two states, but we can certainly point to it and say "it's that".
I'm not sure it's that clear. What about a person who is on drugs to the point that they clearly don't know what's happening around them, but can still speak and move? I'm not sure I'd call that conscious, but by most definitions it is.
>We're pretty clear on the distinction between a conscious and an unconscious human.
You are using "unconscious" as a synonym for "asleep", which is not the same thing as having no conscious experience, because of dreams. We are clear on the distinction between a dead human and an alive human, however.
Now discuss whether a bonobo, a dog, a cat, a mouse, an ant, a bacterium is conscious.
And you’ll find it’s not as clear cut.
Those terms don't match how we use the word "conscious" in any other situation, though. With a definition like that you would say a rock is unconscious (reasonable, I guess), a pretty cold bacterium is unconscious (hmm... OK, I guess?), and a warm bacterium is conscious (now I'm not on board anymore).
We have to be WAY more specific about what the word even means!
Clive Wearing's memory lasts for less than 30 seconds, so he has no memory of being awake before now. He is permanently in a state of feeling like he has just woken up, observing his surroundings for the first time.
Clive Wearing's mind has no continuity over time and essentially zero memory integration. Is he not conscious? There are interviews with the guy.
Where on the scale [No mind <-> Clive Wearing <-> Healthy human brain] would you put an LLM with a 10M token context window?