Comment by basisword
3 months ago
Especially given this[1].
>> "ChatGPT is trained to direct people to seek professional help," such as the 988 suicide and crisis hotline in the US or the Samaritans in the UK.
>> The company acknowledged, however, that "there have been moments where our systems did not behave as intended in sensitive situations".