
Comment by akavi · 7 days ago

You are aware that, insofar as AI chat apps are "hallucinatory text generator(s)", Google Translate is one too, right?

(while AFAICT Google hasn't explicitly said so, it's almost certainly also powered by an autoregressive transformer model, just like ChatGPT)
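
For what it's worth, "autoregressive" just means the output is produced one token at a time, each step conditioned on the tokens emitted so far. Below is a minimal, purely illustrative sketch of that loop; the vocabulary, the toy_model scores, and the greedy_decode name are all made up for illustration, not anything from Google's or OpenAI's systems:

```python
# Toy sketch of greedy autoregressive decoding. "toy_model" is a
# hypothetical stand-in that scores the next token given the prefix;
# a real system would run a transformer forward pass here, and a
# translation model would additionally condition on the source sentence.
VOCAB = ["<eos>", "hello", "world", "again"]

def toy_model(prefix):
    # Hard-coded scores over VOCAB, just to make the loop runnable.
    if len(prefix) < 3:
        return [0.0, 0.1, 0.8, 0.1]  # prefer "world"
    return [0.9, 0.0, 0.05, 0.05]    # then prefer "<eos>"

def greedy_decode(model, max_len=10):
    tokens = []
    for _ in range(max_len):
        scores = model(tokens)
        next_id = max(range(len(scores)), key=scores.__getitem__)
        if VOCAB[next_id] == "<eos>":
            break
        tokens.append(VOCAB[next_id])
    return " ".join(tokens)

print(greedy_decode(toy_model))  # -> "world world world"
```

A translation model runs the same loop, but every step is also conditioned on the source sentence, which is where the two kinds of system start to differ.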

> it's almost certainly also powered by an autoregressive transformer model, just like ChatGPT

The objective of that model, however, is quite different from that of a chat LLM: it is trained on parallel corpora to reproduce a reference translation of the source text it is given, which leaves it far less room to invent content than open-ended next-token prediction does.
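
A rough sketch of that contrast, in my own framing (Google hasn't published the details of the production Translate model): a translation model maximizes the likelihood of a reference translation y given a source sentence x, while chat-style LLM pretraining maximizes next-token likelihood over open-ended text w.

```latex
% Translation model (seq2seq), supervised on parallel corpora:
% every output token is conditioned on the whole source sentence x.
\max_\theta \sum_{(x,y)} \sum_t \log p_\theta(y_t \mid y_{<t},\, x)

% Chat-style LLM pretraining, self-supervised next-token prediction:
% nothing external pins the continuation down.
\max_\theta \sum_w \sum_t \log p_\theta(w_t \mid w_{<t})
```

The extra conditioning on x is what keeps a translator on a short leash.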

I have seen Google Translate hallucinate exactly zero times over thousands of queries over the years. Meanwhile, LLMs emit garbage roughly 1/3 of the time, in my experience. Can you provide an example of Translate hallucinating something?

  • Agreed, and I use Google Translate daily to handle living in a country where 95% of the population doesn’t speak any language I do.

    It occasionally messes up, but not by hallucinating; it’s usually grammar salad because what I put into it was somewhat ambiguous. It’s also terrible with genders in Romance languages, but then those are a nightmare for humans too.

    Pat pat, bot.

Google Translate hasn't moved to LLM-style translation yet, unfortunately.