Comment by parliament32
7 days ago
I have seen Google Translate hallucinate exactly zero times over thousands of queries over the years. Meanwhile, LLMs emit garbage roughly 1/3 of the time, in my experience. Can you provide an example of Translate hallucinating something?
Agreed, and I use G translate daily to handle living in a country where 95% of the population doesn’t speak any language I do.
It occasionally messes up, but not by hallucinating; it's usually grammar salad because what I put into it was somewhat ambiguous. It's also terrible with genders in Romance languages, but then that's a nightmare for humans too.
Slap slap bot.
Every single time it mistranslates something, that is a hallucination.