Comment by s0kr8s

8 days ago

The argument in the paper is about clinical efficacy, but many of the comments here argue that even lower clinical efficacy at a greatly reduced cost might be beneficial.

As someone in the industry, I agree there are too many therapists and therapy businesses right now, and a lot of them are likely not delivering value for the money.

However, I know how insurance companies think. If you want to see people get really upset, take a group of people who are already emotionally unbalanced and have their health insurance company start telling them they have to talk to an LLM before they're allowed to see a human therapist, like having to go through Tier 1 support at a call center before getting permission to speak with someone who actually knows how to fix your issue. Pretty soon you're seeing a spike in bomb threats.

Even if we pretend someone cracks AGI, most people -- at least outside of tech circles -- would probably still prefer to talk to humans about their personal problems, and would complain loudly if pressured otherwise.

Maybe that all changes if we reach some kind of Blade Runner future where that AGI gets injected into a passably humanoid robot, but that's probably still quite a ways off...