Comment by bxguff
5 hours ago
Is anybody shocked that when prompted to be a psychotherapy client models display neurotic tendencies? None of the authors seem to have any papers in psychology either.
There is nothing shocking about this, precisely, and yes, it is clear from how the authors use the word "psychometric" that they don't know much about psychology research either.
I'm not shocked at all. This is how the tech works, after all: word prediction until grokking occurs. So, like any good stochastic parrot, if it's smart when you tell it it's a doctor, it should be neurotic when you tell it it's crazy. It's just mapping to different latent spaces on the manifold.
I think popular but definitely-fictional characters are a good illustration: if the prompt describes a conversation with Count Dracula living in Transylvania, we'll perceive a character in the extended document that "thirsts" for blood and is "pained" by sunlight.
Switching things around so that the fictional character is "HelperBot, an AI tool running in a datacenter" will alter things, but it doesn't make those qualities any less illusory than CountDraculaBot's.