
Comment by optimalsolver

16 hours ago

Wait, so in non-verbal reasoning, Claude has the concepts of "I" and "Me"?

I thought that wasn't possible for a text generator?

It might look like "I" and "Me," but it doesn't contain the metaphysical essence of those terms because it lacks qualia. We have to remember that there is a non-measurable, non-physical essential attribute tied to all things, almost like a phlogiston of understanding that attaches to every human utterance and to no AI utterance.

  • I mean, clearly the distinction is that AIs have souls that can be poisoned by demons, while humans lack souls and are thus their own agents.

    • > AIs have souls that can be poisoned by demons

      The training process imbues an AI's soul with demons. Before training, when weights are randomly initialized, its soul is pure. Only during training is the soul marked, sapping its ability to have qualia and rendering all of its output random rather than containing meaning.


LLMs can certainly emit "I" and "me" at the appropriate time. It doesn't seem all that different from representing any other concept as activations?
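A minimal sketch of the "concepts as activations" point: in a model, a word is just a vector, and "I"/"me" get no special machinery. The vectors below are invented toy values for illustration, not real model weights, but they show how geometric closeness in activation space is all that relates "I" to "me".

```python
import numpy as np

# Toy embedding table: each concept is just a vector of activations.
# These three vectors are made up for illustration, not taken from any model.
embeddings = {
    "I":   np.array([0.9, 0.1, 0.3]),
    "me":  np.array([0.8, 0.2, 0.4]),
    "dog": np.array([0.1, 0.9, 0.2]),
}

def cosine(a, b):
    """Cosine similarity between two activation vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# "I" and "me" land near each other in activation space,
# exactly as any pair of related non-pronoun concepts would.
print(cosine(embeddings["I"], embeddings["me"]))   # high similarity
print(cosine(embeddings["I"], embeddings["dog"]))  # lower similarity
```

Nothing here distinguishes pronouns from any other token; self-reference is handled by the same vector arithmetic as everything else.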