Comment by jillesvangurp
8 days ago
There's also the notion that some people have a hard time talking to a therapist. The barrier to asking an LLM some questions is much lower. I know some people with professional backgrounds in this who deal with patients that use LLMs. It's not all that bad. And the pragmatic attitude is that, whether they like it or not, it's going to happen anyway. So they kind of have to deal with this stuff and integrate it into what they do.
The reality with a lot of people who need a therapist is that they are reluctant to get one. So those people exploring some issues with an LLM might actually produce positive results. Including a decision to talk to an actual therapist.
That is true, and also so sad and terrifying. A therapist is bound by serious privacy laws, while an LLM company will happily gobble up all the information a person feeds it. And the three-letter agencies are surely in the loop.
> A therapist is bound to serious privacy laws
A therapist can send you to involuntary confinement if you give certain wrong answers to their questions, and is a mandatory reporter to the same law enforcement authorities you just described if you give another type of wrong answer. LLMs do neither of these and so are strictly better in that regard.
> A therapist can send you to involuntary confinement if you give certain wrong answers to their questions
That is not how that works at all.
I don't disagree with what you are saying, but that ship sailed decades ago.
Nobody in the tech area did anything meaningful to keep them at bay, like building a fully free search engine where an actual law in an actual country prohibits introducing ads or moving data out of the data center, etc.
We were all too happy to just take the freebies. The bill always comes due, though. And the bill has been coming for several years now, on many different fronts.
Where are the truly P2P, end-to-end encrypted, decentralized mainstream internet services? Everyone is on Telegram or WhatsApp, some are on Signal. Every company chat is either in Slack or Teams. To run a custom email address you need to convince Google and Microsoft not to mark your emails as spam... imagine that.
Again, the ship sailed a long, long time ago. Nobody did anything meaningful to stop it.
Nobody in their right mind is using cloud LLMs for "therapy"; it's all done with local LLMs.
The latest local models that run on consumer-grade hardware can likely provide a "good enough" resemblance of conversation with a human and offer situation-dependent advice.
Which, btw, is not therapy in any way, shape, or form, but a way to think things through and see what the various options are.
> Nobody in their right mind is using cloud LLMs for "therapy"; it's all done with local LLMs.
"Nobody with a working heart uses the cloud-based automatic heart transplant machine!"
To be honest, based on my personal experience with smaller models running on local consumer-grade hardware, I am more worried about the quality, and therefore someone's mental health, than about privacy. Many small models can't even answer the exact question asked once the query gets more complex.
"Nobody in their right mind.."
Um, did you forget we were talking about therapy? :)