Comment by AdamN

1 month ago

> Those are still feelings reserved only for real human beings

Those aren't feelings; they are words associated with a negative outcome that resulted from the subject's actions.

You could argue that feelings are the same thing, just not words.

  • That would be a silly argument because feelings involve qualia, which we do not currently know how to precisely define, recognize or measure. These qualia influence further perception and action.

    Any relationship between certain words and a modified probabilistic outcome in current models is an artifact of the training corpus containing examples of these relationships.

    I contend that modern models are absolutely capable of thinking, problem-solving, and expressing creativity, but for the time being LLMs do not run in any kind of sensory loop which could house qualia.

    • One of the worst or most uncomfortable logical outcomes of

      > which we do not currently know how to precisely define, recognize or measure

      is that if we don't know whether something has qualia (despite its showing external evidence of them), morally you should default to treating it as though it does.

      It seems ridiculous to treat a computer as if it has emotions, but when you break the problem down into steps, it's incredibly hard to avoid that conclusion. "When in doubt, be nice to the robot".

    • > qualia, which we do not currently know how to precisely define, recognize or measure

      > which could house qualia.

      I postulate this is a self-negating argument, though.

      I'm not suggesting that LLMs think, feel, or do anything else of the sort, but these arguments are not convincing. If I only had the transcript and knew nothing about who wiped the drive, would I be able to tell it was an entity without qualia? Does it even matter? I further postulate these are not obvious questions.

    • Qualia may not exist as such. They could essentially just be 'names' for states of neurons that we mix and match, like chords on a keyboard: arguing over the 'redness' of a percept is like arguing about the C-sharpness of a chord. We can talk about some frequencies, but that's it. We would have no way of knowing otherwise, since we only perceive the output of our neural processes; we don't get to participate in the construction of those outputs, nor sense them happening. We just 'know' they are happening when we reach those neural states, and we identify those states relative to the others.

    • "It's different. I can't say why it's different, except by introducing a term that no one knows how to define," isn't the ironclad meat defense you were perhaps hoping it was.

    • > That would be a silly argument because feelings involve qualia, which we do not currently know how to precisely define, recognize or measure.

      If we can't define, recognize or measure them, how exactly do we know that AI doesn't have them?

      I remain amazed that a whole branch of philosophy (aimed, theoretically, at describing exactly this moment of technological change) is showing itself up as a complete fraud. It's completely unable to describe the old world, much less provide insight into the new one.

      I mean, come on. "We've got qualia!" is meaningless. Might as well respond with "Well, sure, but AI has furffle, which is isomorphic." Equally insightful, and easier to pronounce.

  • Feelings have physical analogs which are (typically) measurable, however, at least without a lot of training to control them.

    Shame, anger, arousal/lust, greed, etc. have real physical ‘symptoms’. An LLM doesn't have those.

    • LLMs don't really exist physically (except in the most technical sense), so the point is kind of moot and obvious if you accept this particular definition of a feeling.

      LLMs are neither mammals nor animals; expecting them to feel in a mammalian or animal way is misguided. They might have a mammalian-feeling analog, just as they might have human-intelligence-analog circuitry in the billions (trillions nowadays) of parameters.
