Comment by lowsong
8 days ago
Therapy is one of the most dangerous applications you could imagine for an LLM. Exposing people who already have mental health issues, who are extremely vulnerable to manipulation or delusions, to a machine that's designed to produce human-like text is so obviously risky it boggles the mind that anyone would even consider it.
Everyone is already using LLMs for therapy. The "should" argument is moot.
Then it needs to be banned and prosecuted as medical malpractice. The "should" argument is very much not moot.
Prosecute the person using the LLM or the LLM?