Comment by SamBam
6 months ago
I find it weird the way it's just lying. Did you tell it to impersonate a living human?
Obviously I get that LLMs have no concept of truth and hallucinate all the time, but I would have thought the system prompt would have told it to acknowledge that it's a chatbot.