Comment by notachatbot123

8 days ago

That is true, and also sad and terrifying. A therapist is bound by serious privacy laws, while an LLM company will happily gobble up all the information a person feeds it. And the three-letter agencies are surely in the loop.

> A therapist is bound to serious privacy laws

A therapist can send you to involuntary confinement if you give certain wrong answers to their questions, and is a mandatory reporter to the same law enforcement authorities you just described if you give another kind of wrong answer. LLMs do neither of these things and so are strictly better in that regard.

  • > A therapist can send to involuntary confinement if you give certain wrong answers to their questions

    That is not how that works at all.

I don't disagree with what you are saying, but that ship sailed decades ago.

Nobody in the tech sector did anything meaningful to keep them at bay, like building a fully free search engine where an actual law in an actual country prohibits introducing ads or moving data out of the data center, etc.

We were all too happy to just take the freebies. The bill always comes due, though. And the bill has been arriving for several years now, on many different fronts.

Where are the truly P2P, end-to-end encrypted, decentralized mainstream internet services? Everyone is on Telegram or WhatsApp, some are on Signal. Every company chat is in either Slack or Teams. To run a custom email server you need to convince Google and Microsoft not to mark your emails as spam... imagine that.

Again, the ship sailed a long, long time ago. Nobody did anything meaningful to stop it.

Nobody in their right mind is using cloud LLMs for "therapy"; it's all done with local LLMs.

The latest local models that run on consumer-grade hardware can likely provide a "good enough" approximation of conversation with a human and offer situation-dependent advice.

Which, by the way, is not therapy in any way, shape, or form, but a way to think things through and see what the various options are.

  • > Nobody in their right mind is using cloud LLMs for "therapy"; it's all done with local LLMs.

    "Nobody with a working heart uses the cloud-based automatic heart transplant machine!"

  • To be honest, based on my personal experience with smaller models running on local consumer-grade hardware, I am more worried about quality, and therefore someone's mental health, than about privacy. Many small models can't even answer the exact question asked once the query gets more complex.