
Comment by jychang

5 days ago

Yeah, this is why Wikipedia is not a good resource and nobody should use it. Also why Google is not a good resource; anybody can make a website.

You should only trust going into a library and reading stuff from microfilm. That's the only real way people should be learning.

/s

So, do you want to actually have a conversation comparing ChatGPT to Google and Wikipedia, or do you just want to strawman typical AI astroturfing arguments with no regard to the context above?

Ironic, as you are answering someone who talked about correcting a human who blindly pasted an answer to their question with no human verification.

  • > So, do you want to actually have a conversation comparing ChatGPT to Google and Wikipedia, or do you just want to strawman typical AI astroturfing arguments with no regard to the context above?

    Dunno about the person you're replying to (especially given the irony re that linked reddit thread), but I would like to actually have a conversation (or even just a link to someone else's results) comparing ChatGPT to Google and Wikipedia.

    I've met people who were proudly, and literally, astroturfing Wikipedia for SEO reasons. Wikipedia took a very long time to get close to reliable (editors now require citations for claims, etc.), and I still sometimes notice pairs of pages making mutually incompatible claims about the same thing, but I don't have the time to find out which is correct.

    Google was pretty reliable for a bit, but for a while now the reliability of its results has been the butt of jokes.

    That doesn't mean any criticisms of LLMs are incorrect! Many things can all be wrong, and indeed are. Including microfilm and books and newspapers of record. But I think it is fair to compare them — even though they're all very different, they're similar enough to be worth comparing.

    • >Wikipedia took a very long time to get close to reliable,

      And that's a good thing to remember. Always be skeptical and know the strengths and weaknesses of your sources. Teachers taught me (and maybe you) to be skeptical and not to use Wikipedia as a citation for a reason. Even today it is horrible at covering current events, and its take on recent history can fluctuate massively. That isn't me dismissing Wikipedia as a whole, nor saying it has no potential.

      >Google was pretty reliable for a bit, but for a while now the reliability of its results has been the butt of jokes.

      Yes, all the more reason to be skeptical. It's a bit unfortunate how often it's the third-to-fifth result that is more reliable than the SEO-optimized slop that won the race to the top, unless I'm using very specific queries.

      ---

      Now let's consider these chatbots. There's no editorial oversight, they are not deterministic, and they are known to constantly hallucinate instead of admitting ignorance. There does not seem to be any real initiative to fix such behavior; it gets ignored and dismissed with "the tech will get better".

      Meanwhile, we saw the most blatant piece of abuse last week when Grok was updated, showing that these are not impartial machines simply synthesizing existing information. They can be tweaked to a private entity's whims the same way a search algorithm or a biased astroturfer can skew the other two subjects of comparison. There are clear flaws and no desire or push to really fix them; they are simply cast off as bugs to fix rather than the societal letdown they should be viewed as.


Ah yes, the thing that told people to administer insulin to someone experiencing hypoglycemia (likely fatal, BTW) is nothing like a library or a Google search, because people blindly believe its output thanks to the breathless hype.

See Dunning-Kruger.