Comment by i80and

If you have unusual self-discipline and mental rigor, yes, you can use LLMs as a rubber duck that way. I would be severely skeptical of the value over a diary. But humans are, in an astonishing twist, wired to assume that if they're being replied to, there's a mind like theirs behind those replies.

The more subjective the topic and the more volatile the user's state of mind, the more likely they are to gaze too deep into that face on the other side of their funhouse mirror and think it actually is their friend, and that it "thinks" like they do.

I'm not even anti-LLM as an underlying technology, but the way chatbot companies are operating in practice is kind of a novel attack on our social brains, and it warrants a warning!

>humans are, in an astonishing twist, wired to assume that if they're being replied to, there's a mind like theirs behind those replies

Interesting, and not really part of my experience (though I'll need to reflect on it); thanks for sharing. It's a little like when people discover that their aphantasia isn't the common experience of most other people. I tend towards strong skepticism (I'm fond of Pyrrhonism), but I assume others to be weakly skeptical rather than blindly accepting.