Comment by devoutsalsa

7 days ago

One obvious limitation of LLMs is censorship & telling you what you want to hear. A therapist can say, "I'm going to be honest with you, <insert something you need to hear here>". An LLM isn't going to do that, and it probably shouldn't. I think it's fine to treat LLM advice the way you'd treat advice from a friend: something to think about, not professional advice. It's not going to diagnose an issue that would be obvious to a therapist in person but invisible in the prompts you give it. For example, if you're wondering why you can't attract a member of the opposite sex, a therapist may notice that you have poor hygiene and dress like a hobo.

Therapists are (or should be, if they're any good) very good at recognizing when a patient is giving false information, dodging key topics, or trying to manipulate them. It's very common for patients to hide things from the therapist or even lie outright, even though that's counter to the goals of therapy.

LLMs won't recognize this. They are machines that take input and produce related output that looks correct. It's not hard to change your words and press the retry button until you get the answer you want.

It’s also trivial to close the chat and start a new one if the advice starts feeling like it’s not what you want to hear. Some patients can quit human therapists and get new ones on repeat, but it takes weeks and a lot of effort. With an LLM it’s just a click and a few seconds and that inconvenient therapy note is replaced with a blank slate to try again for the desired answer.

  • I think this is a valid point. At the same time, a user who just wants to talk or pour their heart out to an empathetic listener might still benefit from an LLM.