Comment by crooked-v
8 days ago
You're assuming the answer is yes, but the anecdotes about people going off the deep end from LLM-enabled delusions suggest that "first, do no harm" isn't in the programming.