Comment by jameskilton

7 hours ago

Folks are getting dangerously attached to [political parties/candidates/news sources/social networks] that always tell them they're right.

It's really nothing new. It takes significant mental energy (a finite resource) to question what you're being told, and to do your own fact checking. Instead people by default gravitate towards echo chambers where they can feel good about being a part of a group bigger than themselves, and can spend their limited energy towards what really matters in their lives.

This situation is different. Those sources are people. This is a calculator, and we have the opportunity to fix it.

  • Less different than you might expect.

    The same reason the things listed above are popular may be the reason the most popular LLM ends up not being the best. People don't tend to buy good things; they very commonly buy the shiniest ones. An LLM that says "you're right" sure seems a lot shinier than one that says "Mr. Jayd16, what you've just said is one of the most insanely idiotic things I have ever heard... Everyone in this room is now dumber for having listened to it. I award you no points, and may God have mercy on your soul."

  • Political parties, social networks, religions: these are all engineered systems. All of them, including AI, involve people. For starters, nobody is going to do the massive amount of work to train a useless AI that is skeptical and cynical. Imagination and agreeability (which cause hallucinations) are a feature, not a bug, in humans and in LLMs.

> It's really nothing new.

I disagree. What's new is that this flattery is individually, personally targeted. The AI user is given the impression that they're having a back-and-forth conversation with a single trusted friend.

You don't have the same personal experience passively consuming political mass media.

  • Yes, it’s the final form of the evolution that social media started.

    The village idiot used to be found out because no one else in the village shared the same wingnut views.

    Partisan media gave you two poles of wingnut views to choose from for reinforcement.

    Social media allowed all the village idiots to find each other and reinforce each other's shared wingnut views, of which there are thousands to choose from.

    Now with LLMs you can have personalized reinforcement of any newly invented wingnut view, on the fly. So people can get into very specific self-radicalization loops, especially the mentally ill.

  • Reddit? Or this site? Sort of? Some people voted for my comment; that surely means I'm right about something, rather than them just liking it, right?

    • The analogy would be that you always get upvoted and never get downvoted, which in my experience is definitely not the case on Reddit or Hacker News.

      I would have downvoted your comment, except you can't downvote direct replies on HN. ;-)