Comment by danudey
18 hours ago
> I wouldn't call that "basic".
"Basic" is relative. Nothing about LLMs is basic; it's all insanely complex, but in the context of a list of requirements "Don't tell people with signs of mental illness that they're definitely not mentally ill" is kind of basic.
> I'm trying to imagine what kind of safety measures would have stopped this, and nothing short of human supervisors monitoring all chats comes to mind.
Maybe this is a problem they should have considered before releasing this to the world and announcing it as the biggest technological revolution in history. Or rather, I'm sure they did consider it, but they should have actually cared rather than shrugging it off in pursuit of billions of dollars and a lifetime of fame and fortune.