
Comment by rpastuszak

14 days ago

I don't think that's true. It's more a function of how these models are trained (remember the older, pre-ChatGPT clients?)

Most of the software I use doesn't need to refer to itself in the first person in the first place. Pretending that we're speaking with an agent is a UX/marketing decision rather than a technical/logical constraint.

I'm not sure about that. What happens if you "turn down the weight" (cf. https://www.anthropic.com/news/golden-gate-claude) for self-concept, meaning not the use of first-person pronouns but "the first person" as a thing that exists? Do "I" and "me" get replaced with "this one", like someone doing depersonalization kink, or does it become like Wittgenstein's lion, in that we can no longer confidently parse even its valid utterances? Does it lose coherence entirely, or does something stranger happen?
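Mechanically, the "turn down the weight" step in the Golden Gate Claude work is activation steering: treat a learned feature as a direction in the model's activation space and clamp its contribution. Here's a toy numerical sketch of just that clamping step, with everything (the feature direction, the hidden state, the `steer` function) made up for illustration; real steering hooks into a transformer's residual stream via an SAE-derived feature direction:

```python
import numpy as np

rng = np.random.default_rng(0)
d_model = 8

# Hypothetical unit-norm direction standing in for a "self-concept" feature.
self_feature = rng.normal(size=d_model)
self_feature /= np.linalg.norm(self_feature)

def steer(hidden, direction, scale):
    """Clamp the component of `hidden` along `direction` to `scale`.

    scale=0 ablates the feature ("turn down the weight"); a large
    positive scale amplifies it (the Golden Gate Bridge demo).
    """
    coeff = hidden @ direction          # current feature activation
    return hidden + (scale - coeff) * direction

hidden = rng.normal(size=d_model)       # stand-in for a residual-stream state
ablated = steer(hidden, self_feature, 0.0)

# After ablation, the state carries no component along the feature direction.
print(abs(ablated @ self_feature) < 1e-9)  # → True
```

Whether zeroing such a direction yields "this one"-style circumlocution or total incoherence is exactly the open empirical question.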

It isn't an experiment I have the resources or the knowledge to run, but I hope someone does and reports the results.