
Comment by jchw

3 days ago

Tangentially: LLMs are really impressive at translation. That shouldn't come as much of a surprise given where a lot of the most pivotal research came from, but still, the leading-edge LLMs are extremely good for situations where a human translator is infeasible or too expensive. If you're worried about correctness, you can go through and verify the translation using reference material, and ask the LLM for more information about a given excerpt, which you can also check against references and online discussions.

I think my only concern is that I'm not sure how to guarantee I'll always have an untainted set of reference material to check against on the post-LLM Internet. We've already had LLM hallucinations turn into software features. Are we possibly headed toward a world where LLM hallucinations occasionally reshape language and slang?

I feel bad for human translators right now. For various use cases, current-day machine translations and especially LLM translations are sufficient. For those not versed in the world of otaku and video game nerds, one extremely fascinating development of the last few years is the one-shot commission platform Skeb, where people can send various kinds of art commission requests to Japanese artists. They integrate with DeepL to support requests from people who don't speak Japanese fluently, and it seems to generally work very well. (The lower-stakes nature of one-shot art commissions helps a bit here too, but at the least I think communication issues are rarely a huge problem, which is pretty impressive.) And that kicked off before LLMs started to push machine translations even further.

My wife has been working in translation recently, and the LLM hit rate for novels is highly variable. Models are capable of dropping entire paragraphs. You still need a final pass from a human native-speaker editor to check that the result makes sense, which is exactly what's happening in this article: the news site cares enough about its brand and quality to check the output.

I agree that bidirectional communication is probably going to work a lot better, because people are more likely to be alert to the possibility of translation issues and can confirm understanding interactively.

  • Definitely context-dependent: things like news articles are probably the gold standard for machine translation, but creative works, and particularly dense reading material like novels, seem like they will remain a tougher nut to crack even for LLMs. It's not hopeless, but it's definitely way too early for anyone to be firing all of their translators.

    > It's capable of just dropping out entire paragraphs.

    I suspect, though, that issues like this can be fixed by improving how we interface with the LLM for translation. (Closed-loop systems that use full LLMs under the hood but output a translation directly, as if they were plain translation models, have probably already solved this kind of problem by structuring the prompt carefully, and possibly incrementally.)
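    To make the closed-loop idea concrete, here is a minimal sketch of a wrapper that feeds the model one paragraph at a time and checks that nothing was silently dropped. `translate_paragraph` is a hypothetical stand-in for a real LLM call; the demo uses a tagging stub instead of a model.

```python
# Sketch: paragraph-by-paragraph translation with a dropped-output check.
# `translate_paragraph` is a hypothetical callable wrapping an LLM request.

def translate_document(text, translate_paragraph):
    """Translate one paragraph at a time so the model cannot silently drop one."""
    paragraphs = [p for p in text.split("\n\n") if p.strip()]
    translated = []
    for p in paragraphs:
        out = translate_paragraph(p)
        if not out.strip():
            # Closed-loop check: an empty result means a dropped paragraph;
            # a real system would retry or flag it for human review here.
            raise ValueError("empty translation for paragraph: " + p[:40])
        translated.append(out)
    return "\n\n".join(translated)

# Demo with a tagging stub in place of a model call:
demo = translate_document("Erster Absatz.\n\nZweiter Absatz.",
                          lambda p: "[EN] " + p)
```

    The point is only that the loop structure, not the model, guarantees a one-to-one mapping between source and output paragraphs.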

They are. For unlicensed fan translations of indie Japanese games, the word "MTL" used to mean "unplayable translation quality" until maybe a year or two ago. Now ChatGPT can maintain enough context to translate a game mostly correctly. There are still cases where names flip-flop between two plausible renderings (e.g. Rina/Lina, or "scroll of wisdom"/"sage scroll") or a character's gender is not inferred correctly, but they are rare enough that a single editor can crowdsource and apply the fixes. The prose itself is now finally legible, and you no longer feel like you're reading the clues to a cryptic crossword.

I was watching a graphic novel on YouTube yesterday that had been translated from Japanese text into English narration. It was weird, not perfect, and probably copyright infringement, but pretty effective. I think it's only a matter of time until we have real-time local translator hardware we can just plug into our ears when traveling, or, heck, when working in another country where you don't speak the local language. Language barriers are going to fall quickly.

  • The auto-dubbing on YouTube has the classic hallmark of an AI product: you can't easily turn it off.

    (The audio track switcher, which will give you back the original audio, is not available on mobile. Fortunately, if you use NewPipe, it is.)

    > real time local translator hardware that we can just plug in our ear when traveling

    These are definitely already a thing, popular in Asian countries. https://www.aliexpress.com/item/1005008777097933.html

  • It's no doubt possible to have translation with less lag on average than current tools like Google Translate's Conversation Mode, but truly real-time translation is impossible because of word-order differences: you can't translate a word you haven't heard yet.

    • Reminds me of a letter to the Times regarding German word order (and the fact that the verb often comes last)

      > Sir, Your reader's reference to German word order reminds me of a UN meeting at which I worked when the German delegate ranted for ages while all French eyes turned to the French interpreter booth. The interpreter witheringly interjected "j'attends le verbe" ("I'm waiting for the verb").

    • But truly real-time translation is rarely (never?) critical. For streamed media there is usually enough of a buffer to handle word-order swaps without perceptible delay (although one can construct contrived counter-examples), and for human interactions that delay is likely less than normal human processing time.

  • Have you seen Soniox? They support real-time translation (speech-to-text translation only, for now).

    https://soniox.com/

    (disclaimer: I worked there)

    • You know someone will figure out how to do real-time translation using the same voice as the speaker. And in a room with more than one speaker, rather than figuring out which one you're listening to, just translate them all and remix according to how the input sources were mixed.


> LLMs are really impressive at translation.

I love how you can mix languages in one sentence: "Anyway, re-add Tiefkühl and tell me to just check what they have there as TK-Gemüse, and a note at the end that I should go to EDEKA someday this or next week." (Tiefkühl/TK-Gemüse: frozen food/frozen vegetables; EDEKA is a German supermarket chain.)