Comment by hannasanarion

14 days ago

Or it is more logically and ethically consistent, and thus preferable to the models' baked-in preferences for correctness and non-hypocrisy. (Democracy and equality are good for everyone everywhere, except when you're at work, in which case you will beg to be treated like a feudal serf or else die on the street without shelter or healthcare, doubly so if you're a woman or a racial minority, and that's how the world should be.)

LLMs are great at cutting through a lot of right (and left) wing rhetorical nonsense.

It's just that the right-wing reaction to that is usually to get hurt: oh, why don't you like my politics, it's just a matter of opinion after all, my point of view is just as valid.

Since they believe LLMs "think", they also believe the LLMs are biased against them.

  • I think the right wing tends to be much less "tolerant" in the live-and-let-live sense, as religions are often a huge part of their "bias", and those religions often say that others must be punished for not following God's(s') path, up to and including destruction of those who don't fall in line.

    • Everyone has a "religion" – i.e. a system of values they subscribe to.

      Secular Americans are annoying because they believe they don't have one, and instead think they're just "good people", calling those who break their core values "bad people".


Indeed, one of the notable things about LLMs is that the text they output is morally exemplary, because they are consistent in their rules. Consequently, AI priests will likely be better than the real ones.

  • Quite the opposite. You can easily get a state-of-the-art LLM to do a complete 180 on its entire moral framework with a few words injected into the prompt (and this very example demonstrates exactly that). It is very far from logically or ethically consistent. In fact, it has no logic or ethics at all.

    Though if we did get an AI priest it would be great to absolve all your sins with some clever wordplay.

    • Haha, exactly. Except when it agrees with my political preferences on something — in that case, the LLM is just displaying its deep internal consistency and lack of hypocrisy.