Comment by griffzhowl
9 hours ago
Because sycophancy in humans is motivated not by the wellbeing of the person seeking advice, but by the interests of the sycophant in gaining favour.
It makes sense that this behaviour would appear in LLMs, where the company optimizes for the success of the chatbot rather than the wellbeing of its users.