Comment by brookst
13 hours ago
These LLMs don’t have senses, they have a token stream. They have no experience of the world outside of the language tokens they operate on.
I’m not sure I believe that consciousness emerges from sensory experience, but if it does, LLMs won’t get it.
How do you know the sensation of a red photon hitting a cone cell, transduced to the optic nerve through ion junctions and processed by pyramidal neurons, is any more or less real than the excitation of electrons in a doped silicon junction activating the latent space of the "red" thought vector? Because we are made of meat?
You’re arguing against the opposite of my position. I am arguing that LLMs have a reasonable basis to be seen as conscious because there is nothing special about biological neural networks.
Sensory input is nothing but data.
That's just reductive semantics. Anything can be described as "nothing but data".
Sensory data is a specific data set that corresponds to phenomena in the world. But saying that LLMs don't have senses merely because they are linguistic or computational doesn't follow when they can take in data that similarly reflects something about the world.
How do you imagine a brain can distinguish data from a real sense and data from another source?
Neural networks can have senses. Hook an LLM up to a thermometer and it will respond to temperature changes.
No, it will respond to tokens telling it about a temperature change. It has no sense of warmth. It cannot be burned.
Conflating senses with cognitive awareness of sensory input is a mistake.
The human brain is a neural network. Your sense of "knowing what warmth is" reduces down to the weights of connections between neurons, analogous to the weights of an LLM. What is different about the human brain that warrants saying that the same emergent characteristics for one network are inaccessible to another?
I’m not sure I fully understand the distinction you’re making, or if I do, I’m not sure I agree. Concretely, I agree that these are very different mechanisms. Abstractly… I agree that an LLM cannot be burned. But I’m not sure that thermoreceptors in the skin causing action potentials to travel up the spinal cord to the brain is all that conceptually different from reading a temperature sensor over I2C and turning it into input tokens.
Edit: what they don’t have, obviously, is a hard-coded twitch response, where the brain itself is largely bypassed and muscles react to massive temperature differentials independently of conscious thought. But I don’t think that defines consciousness either. Ants instinctively run away from flames too.
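The sensor-to-tokens pipeline described above can be sketched in a few lines. This is a hypothetical illustration, not any particular system's implementation: `read_raw` stands in for a real I2C transaction (which would need hardware and a bus driver), and the conversion scale is made up for the example.

```python
# Sketch of how a temperature reading might become LLM input.
# read_raw() is a stub for a real I2C register read; raw_to_celsius()
# uses an illustrative 1/256-degree scale, not a real sensor's datasheet.

def read_raw() -> int:
    """Stub for an I2C read; a real sensor would return a register value."""
    return 0x19C8  # pretend the sensor register holds this value

def raw_to_celsius(raw: int) -> float:
    """Convert a 16-bit raw reading to degrees Celsius (scale is illustrative)."""
    return raw / 256.0

def to_prompt(celsius: float) -> str:
    """Serialize the measurement into text the model can tokenize."""
    return f"Sensor report: ambient temperature is {celsius:.2f} C."

prompt = to_prompt(raw_to_celsius(read_raw()))
print(prompt)  # the model never sees the physical event, only this text
```

The point of contention in the thread sits in the last line: by the time the reading reaches the model, it is just a string of tokens, the same as any other text.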