Comment by palmotea

11 hours ago

> I have to admit, nowadays Google AI Overview's accuracy is so good that I often don't check the links. It's scary that it got from 'practically useless' to 'the actual google search' in less than two years.

It says things I know to be false fairly regularly. I don't keep a log or anything, but it's left an impression that it's far from reliable.

For my anecdote, I don't frequently deign to look at the overview at all... but every time I have, it has been completely and totally wrong. There's probably some selection bias in when I choose to look at it again, but it's still notable that the failure rate is that high.

Today I searched something and almost pasted the output into an internet forum discussion I was having. But I decided to check the Wikipedia source just to make sure. The AI summary was not quoted directly from Wikipedia, and it got some major aspects of the article wrong. Lesson learned.

It is simply summarizing the top few search results. If those are false, then the summary will be false.

  • That's not always the cause of the wrong info. I've had a few situations where I was asking pretty specific questions that absolutely have publicly documented authoritative answers, and the first several search results were either the authoritative sources themselves or things that reference the authoritative sources.

    The AI answer got the actual questions completely wrong. The questions involved vehicle registration laws in a specific state. The questions included the name of the state. The AI's answer seemed to be giving information based on other states.

    All of the first-page search results were specific to the state I asked about, so if it were just summarizing those, you wouldn't expect answers that only exist on pages about entirely different states.