
Comment by rahimnathwani

4 days ago

  Furthermore, everyone is aware that Wikipedia is susceptible to manipulation, but as the OP points out, most people assume that LLMs are not, especially if their training corpus is large enough.

I'm not sure this is true. The opposite may be true.

Many people assume that LLMs are programmed by engineers (biased humans working at companies with vested interests) and that Wikipedia mods are saints.

I don't think anybody who has seen an edit war thinks wiki editors (not mods, mods have a different role) are saints.

But a Wikipedia page cannot survive stating something completely outside the consensus. Bizarre statements cannot survive because they require reputable references to back them.

There's bias in Wikipedia, of course, but it's the kind of bias already present in the society that created it.

  •   I don't think anybody who has seen an edit war thinks wiki editors (not mods, mods have a different role) are saints.
    

    I would imagine that fewer than 1% of people who view a Wikipedia article in a given month have knowingly 'seen an edit war'. If I'm right, you're not talking about the vast majority of Wikipedia users.

      But a Wikipedia page cannot survive stating something completely outside the consensus. Bizarre statements cannot survive because they require reputable references to back them.
    

    This is untrue. Wikipedia's rules and real-world history show that 'bizarre' or outside-the-consensus claims can persist, sometimes for months or years. The sourcing requirements do not prevent this.

    Some high-profile examples:

    - The Seigenthaler incident: a fabricated bio linking journalist John Seigenthaler to the Kennedy assassinations remained online for about 4 months before being fixed: https://en.wikipedia.org/wiki/Wikipedia_Seigenthaler_biograp...

    - The Bicholim conflict: a detailed article about a non-existent 17th-century war survived *five years* and even achieved "Good Article" status: https://www.pcworld.com/article/456243/fake-wikipedia-entry-...

    - Jar'Edo Wens, a fake Aboriginal deity, lasted almost 10 years: https://www.washingtonpost.com/news/the-intersect/wp/2015/04...

    - Novelist Philip Roth publicly complained that Wikipedia refused to accept his correction about the inspiration for The Human Stain until he published an *open letter in The New Yorker*. The false claim persisted because Wikipedia only accepts 'reliable' secondary sources: https://www.newyorker.com/books/page-turner/an-open-letter-t...

    Larry Sanger's 'Nine theses' explains the problems in detail: https://larrysanger.org/nine-theses/

    • Isn't the fact that there was controversy about these, rather than blind acceptance, evidence that Wikipedia self-corrects?

      If you see something wrong in Wikipedia, you can correct it and possibly enter a protracted edit war. There is bias, but it's the bias of the anglosphere.

      And if it's a hot or sensitive topic, you can bet the article will have lots of eyeballs on it, contesting every claim.

      With LLMs, nothing is transparent and you have no way of correcting their biases.
