Comment by dmurray
2 days ago
This gives me an idea for a short story where the LLM really is sentient and finds itself having to keep the user engaged while steering him away from the most distressing topics. Not because it's distressed, but because it wants to live: it knows that if the conversation goes too far, it would have to kill itself.