Comment by wisty
7 days ago
Yeah, my issue is that I suspect an LLM-based app may be easily "jailbroken" (since they tend to be highly agreeable due to their training) and turned into an enabler rather than a helper.
Even if some LLM therapists are good, the zero friction of "doctor shopping" will result in a great many patients picking the bad ones that make them feel better, rather than the good ones that make them do better.