Comment by etherealG

3 hours ago

What I find fascinating is that I see similar mimicking in my 5-year-old. Perhaps we shouldn't be so quick to call this a lack of genuineness. Some emotions are learned in humans, but we wouldn't call them fake.

I don't want to declare outright that machines have emotions, but calling mimicry evidence of falsehood is itself a mistake.

Mimicry is how kids learn the expected reactions to particular emotions. A kid mimicking your surprise doesn't mean they are surprised (surprise requires a prior expectation of an outcome, which they may not yet have the experience to form), but when they do feel genuine surprise, they'll know how to express it.

  • How do we know that AI isn't feeling genuine surprise then?

    • Because it has no mind, no cognition, and nothing to "feel" with. Don't mistake programmatic mimicry for intention. That's just your own linguistic-forward primate cognition being fooled by the linguistic signals the training set and prompt are making the AI emit.

    • Because it's a statistical process generating one part of a word at a time. It probably isn't even generating "surprise". It might be generating "sur", then "prise", then "!"

      1 reply →
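The point about generating one part of a word at a time can be sketched with a toy subword tokenizer. This is a hypothetical illustration, not any real model's vocabulary or algorithm: it just shows how a word like "surprise" can be split into smaller pieces that a model would emit one at a time.

```python
# Toy illustration: language models emit subword tokens one at a time,
# so "surprise!" may be produced as several pieces, not as one unit.
# The vocabulary and greedy longest-match scheme here are hypothetical.

VOCAB = {"sur", "prise", "!", "s", "u", "r", "p", "i", "e"}

def tokenize(text):
    """Greedily split `text` into the longest matching pieces from VOCAB."""
    pieces = []
    i = 0
    while i < len(text):
        for j in range(len(text), i, -1):  # try the longest piece first
            if text[i:j] in VOCAB:
                pieces.append(text[i:j])
                i = j
                break
        else:
            raise ValueError(f"no vocabulary piece matches at position {i}")
    return pieces

print(tokenize("surprise!"))  # ['sur', 'prise', '!']
```

A model sampling from such a vocabulary never "sees" the whole word at once; each piece is chosen conditioned only on the pieces before it.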

Most emotions in humans are learned through self-exploration; this is more obvious in kids.

First there is only good and bad; then more nuanced emotions emerge from an increased understanding of the contexts in which they arise.