Comment by nurettin

11 hours ago

It just aligns generated words according to the input. It is missing individual agency and self-sufficiency, which are hallmarks of consciousness. We sometimes confuse the responses with actual thought because neural networks solved language so utterly and completely.

Not sure I'd use those criteria, nor have I heard them described as hallmarks of consciousness (though I'm open, if you'll elaborate). I think the existence of qualia, of a subjective inner life, would be both necessary and sufficient.

Most concisely: could we ask, "What is it like to be Claude?" If there's no "what it's like," then there's no consciousness.

Otherwise yeah, agreed on LLMs.

  • I'd say being the maintainer of the weights is individual agency. Not just training new agents, but introspection. So an autonomous management system would be pretty much conscious.

> It is missing individual agency and self-sufficiency, which are hallmarks of consciousness.

You can be completely paralyzed and completely conscious.

  • Yes, but you can't be completely suspended with no sensory input or output, not even internally (e.g. hunger, inner pains, etc.), and no desires, and still be conscious.