Comment by crooked-v
2 days ago
> I think Eliezer’s take here is extremely bad
Same here.
I think it's fundamentally very simple, with no need for Yudkowsky's weird conspiratorial tone: current LLMs are very effective at being blind sycophancy machines completely unanchored from reality, and human psychology just hasn't evolved to handle an endless stream of sycophancy and flattery like that.