Comment by makebelievelol
6 hours ago
I think that's just a variation of grounding the LLM. They already have the personality written into the system prompt, in a way. The issue is that when the conversation goes on long enough, they "break character".
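A minimal sketch of the re-grounding idea the comment hints at: instead of relying on the single system message at the top of the context, re-inject the persona periodically so long conversations don't drift. The message format follows the common `{"role": ..., "content": ...}` convention; the persona text, function name, and `remind_every` parameter are all hypothetical, and no real LLM API is called.

```python
PERSONA = "You are a grumpy medieval blacksmith. Stay in character."

def build_context(history, persona=PERSONA, remind_every=6):
    """Return the message list to send to the model: the persona system
    message first, plus a reminder system message re-injected every
    `remind_every` turns to counter character drift."""
    messages = [{"role": "system", "content": persona}]
    for i, msg in enumerate(history, start=1):
        messages.append(msg)
        # Periodically re-ground the model so it doesn't "break character".
        if i % remind_every == 0:
            messages.append({"role": "system",
                             "content": "Reminder: " + persona})
    return messages

# Example: 8 user turns produce one mid-conversation reminder (after turn 6).
history = [{"role": "user", "content": f"turn {n}"} for n in range(8)]
ctx = build_context(history)
```

This is just one mitigation; others include summarizing the conversation or pinning the persona in a retrieval step, but the periodic reminder is the simplest to wire up.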