Comment by CamelCaseCondo
2 days ago
Same experience here. I have fond memories of “google code”, a search engine for public code repositories that was exceptionally good for finding literal quotes.
The more mainstream a subject is, the lower the incidence of hallucinations. With Google search, the mantra “I can’t be the first with this problem/question” almost always proves right.
I’m in the process of restoring a piece of vintage electronics, and every time I ask Gemini (fast or thinking) for help, I get sent down an irrelevant rabbit hole. It pulls info from service manuals for other equipment with a similar product number, misinterprets diagrams, and gets the electrical workings wrong.
These things aren’t AI. AI can extract certainty from uncertain data. LLMs take data and turn it into garbage.