Comment by throwaway290
8 months ago
LLMs are all bad at math. But there are worse ways Google fails.
For example, people asked "does Lululemon use <name of some chinese company> to make its products?" and Google answered "yes", with no source except one TikTok video that makes that false claim to boost sales in the face of tariffs. (Never mind that the company isn't in the actual supplier list Lululemon publishes on its site.)
Which means people see that TikTok, go to Google to fact-check whether it's true, and the AI overview says "yes" (plus paragraphs of text that no one reads), citing that same TikTok.
A vicious circle of LLM fact-checking. Google used to be immune to it, until it started shoving chatbot output in people's faces.