Comment by spoaceman7777

2 months ago

Hmm. I think you may be confusing sycophancy with simply following directions.

Sycophancy is a behavior. Your complaint seems more about social dynamics and whether LLMs have some kind of internal world.

Even "simply following directions" is something a chatbot will do that a real human would not -- and interacting with a real human who pushes back is important for human development.