Comment by nurettin
2 months ago
It just aligns generated words with the input. It is missing individual agency and self-sufficiency, which are hallmarks of consciousness. We sometimes confuse the responses with actual thought because neural networks solved language so utterly and completely.
Not sure I'd use those criteria, nor have I heard them described as hallmarks of consciousness (though I'm open, if you'll elaborate). I think the existence of qualia, of a subjective inner life, would be both necessary and sufficient.
Most concisely: could we ask, "What is it like to be Claude?" If there's no "what it's like," then there's no consciousness.
Otherwise yeah, agreed on LLMs.
I'd say being the maintainer of your own weights is individual agency. Not just training new agents, but introspection. So an autonomous self-management system would be pretty much conscious.
> It is missing individual agency and self-sufficiency, which are hallmarks of consciousness.
You can be completely paralyzed and completely conscious.
Yes, but you can't be completely suspended, with no sensory input or output, not even internally (e.g., hunger, inner pains), and no desires, and still be conscious.
> no sensory input or output
Multimodal LLMs get input from cameras and text and generate output. They undergo reinforcement learning with some analogy to pain/pleasure, and they express desires. I don't think they are conscious, but I don't think they necessarily fail these proposed preconditions, unless you meant while they are suspended.
Yes, and you have individual agency while completely paralyzed.
Self-sufficiency was one of his conditions.