Comment by nextworddev

16 hours ago

Obviously a meat brain is incomparable to an LLM - they are different types of intelligence. No sane person would claim an LLM is conscious in the meat-brain sense, but it may be conscious in an LLM way, like during the stretches of time when matrix multiplications are firing inside GPUs.

If an LLM could be "conscious in an LLM way", then why not the same, mutatis mutandis, for an ordinary computer program?

  • because an ordinary program is deterministic, while an LLM is probabilistic + it has some synthetic-reasoning ability (see the sketch below)
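
    To illustrate the distinction (a minimal sketch; the logits are hard-coded here as a stand-in for a real model's forward pass): an ordinary program maps the same input to the same output every time, while LLM decoding typically samples the next token from a probability distribution, so repeated runs on the same prompt can diverge.

        import math
        import random

        def softmax(logits, temperature=1.0):
            # Convert raw scores into a probability distribution.
            scaled = [l / temperature for l in logits]
            m = max(scaled)
            exps = [math.exp(s - m) for s in scaled]
            total = sum(exps)
            return [e / total for e in exps]

        def ordinary_program(x):
            # Deterministic: same input, same output, every run.
            return x * 2

        def sample_next_token(logits, temperature=1.0):
            # Probabilistic: the next token is drawn from a distribution,
            # so two runs on the same input can differ.
            probs = softmax(logits, temperature)
            return random.choices(range(len(logits)), weights=probs, k=1)[0]

        # Hypothetical logits for a toy 4-token vocabulary.
        logits = [2.0, 1.0, 0.5, 0.1]
        print(ordinary_program(21))                            # always 42
        print([sample_next_token(logits) for _ in range(5)])   # varies run to run

    (Worth noting: with greedy, argmax decoding an LLM's output is deterministic too, so the distinction only holds under typical sampling settings.)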

It just aligns generated words with the input. It is missing individual agency and self-sufficiency, which are hallmarks of consciousness. We sometimes confuse the responses with actual thought because neural networks solved language so utterly and completely.

  • Not sure I'd use those criteria, nor have I heard them described as hallmarks of consciousness (though I'm open, if you'll elaborate). I think the existence of qualia, of a subjective inner life, would be both necessary and sufficient.

    Most concisely: could we ask, "What is it like to be Claude?" If there's no "what it's like," then there's no consciousness.

    Otherwise yeah, agreed on LLMs.

    • I'd say being the maintainer of one's own weights is individual agency. Not just training new agents, but introspection. So an autonomous self-management system would be pretty much conscious.

  • > It is missing individual agency and self sufficiency which is a hallmark of consciousness.

    You can be completely paralyzed and completely conscious.

    • Yes, but you can't be completely suspended, with no sensory input or output, not even internal ones (e.g. hunger, inner pains), and no desires, and still be conscious.