Comment by nkrisc

3 hours ago

Mimicry is how kids learn the expected reactions to particular emotions. A kid mimicking your surprise doesn’t mean they are surprised (surprise requires a prior expectation about an outcome, which they may lack the experience to form), but when they do feel genuine surprise, they’ll know how to express it.

How do we know that AI isn't feeling genuine surprise then?

  • Because it has no mind, no cognition, and nothing to "feel" with. Don't mistake programmatic mimicry for intention. That's just your own linguistic-forward primate cognition being fooled by the linguistic signals the training set and prompt are making the AI emit.

  • Because it's a statistical process generating one token (often just a fragment of a word) at a time. It probably isn't even generating "surprise" as a unit. It might be generating "sur", then "prise", then "!"

    • But what is surprise, really? An outcome that doesn't follow expectation. The model's distribution may still statistically leverage surprise as a concept, because it has seen that concept expressed in its training data, e.g. "interesting!"

      So it can be true both that the output has nothing to do with the felt emotion of surprise and that it appears to emulate that emotion, since the training data encodes the concept of surprise (a mismatch between expectation and event).
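The point about piece-by-piece generation can be sketched with a toy example. This is not a real language model; the hand-written table, the subword pieces, and the probabilities below are all invented for illustration. It only shows the mechanism the bullet describes: a process that maps context to likely next pieces can emit "surprise!" without anything resembling a concept of, let alone a feeling of, surprise.

```python
import random

# Toy next-piece table (hypothetical, hand-written): given the last
# piece, list the possible next subword pieces with their weights.
# There is no notion of emotion anywhere in this structure.
NEXT_PIECE = {
    "<start>": [("sur", 0.9), ("interesting", 0.1)],
    "sur": [("prise", 1.0)],
    "prise": [("!", 1.0)],
    "interesting": [("!", 1.0)],
    "!": [("<end>", 1.0)],
}

def generate(seed=None):
    """Sample one piece at a time until the end marker appears."""
    rng = random.Random(seed)
    piece, out = "<start>", []
    while piece != "<end>":
        pieces, weights = zip(*NEXT_PIECE[piece])
        piece = rng.choices(pieces, weights=weights)[0]
        if piece != "<end>":
            out.append(piece)
    return "".join(out)

print(generate())  # prints "surprise!" or "interesting!"
```

A real model does the same kind of thing with a learned distribution over tens of thousands of pieces, conditioned on the whole preceding context, but the output is still assembled one sampled fragment at a time.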