Comment by tkgally
6 days ago
Around the time GPT-4 was released in early 2023, a similar issue arose with another profession: translation. It was at that point that machine translation between languages like English and Japanese (the language pair I have worked with) started to approach human level for the first time.
I took part in a lot of discussions then with other professional translators, and the reaction of many was similar to that of some of the commenters here: not only were they discouraged because their hard-earned language and translation skills no longer seemed needed, but using LLMs as assistants took the enjoyable challenge out of the translation process.
Nearly everyone I spoke with then worked from home as a freelancer, carefully crafting one translation at a time. They didn’t like the idea of becoming managers of large-scale translation projects, even if it meant they would be able to apply their higher-order intercultural communication skills.
I do only a little professional translation myself now, but I try to keep up with AI developments and I often use translation tasks to test the latest models and frameworks. Over the past few months, I have vibe-coded some multi-LLM translation systems where texts were passed through multiple models that checked, critiqued, and improved each other’s translations. For the texts I tried them on, the results were much better than any single-LLM translation, approaching the level of the very best human translation. The API calls weren’t cheap, but for high-stakes translations such a system would more than pay for itself.
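Roughly, each pipeline looked something like the sketch below (a minimal illustration only; the model names, prompts, and fixed revision loop are placeholders I'm using here, not my actual setup):

```python
# Minimal sketch of a multi-model translate -> critique -> revise loop.
# Model names and prompts are illustrative placeholders; a real system
# might mix several providers and run more elaborate checks.
from openai import OpenAI

client = OpenAI()

TRANSLATOR = "gpt-4o"       # drafts and revises the translation
CRITICS = ["gpt-4o-mini"]   # one or more models that critique each draft


def ask(model: str, prompt: str) -> str:
    """Send a single-turn prompt to one model and return its text reply."""
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content


def vibe_translate(source_text: str, rounds: int = 2) -> str:
    """Draft a Japanese-to-English translation, then let critic models
    point out problems and have the translator revise, for a few rounds."""
    draft = ask(
        TRANSLATOR,
        f"Translate this Japanese text into natural English:\n\n{source_text}",
    )
    for _ in range(rounds):
        critiques = [
            ask(
                critic,
                "Critique this English translation of the Japanese source. "
                "List mistranslations, omissions, and awkward phrasing.\n\n"
                f"Source:\n{source_text}\n\nTranslation:\n{draft}",
            )
            for critic in CRITICS
        ]
        draft = ask(
            TRANSLATOR,
            "Revise the translation to address these critiques. "
            "Return only the revised translation.\n\n"
            f"Source:\n{source_text}\n\nCurrent translation:\n{draft}\n\n"
            "Critiques:\n" + "\n---\n".join(critiques),
        )
    return draft
```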
When designing that “vibe translation” system, I did apply my experience as a translator, similarly to what Simon is recommending programmers do now with vibe engineering. At this stage in my life (I’m sixty-eight), I am fine with that. But if LLMs had arrived when I was, say, just five or ten years into my translation career and still proud of my nuts-and-bolts skills, I might very well have looked for another career rather than becoming a vibe translator.
Translation is a great field for discussing LLMs. Thanks for sharing your experience.
On one hand, translation is not highly valued by most people. It is rare, for example, for people to know who translated a book. It is a pity but people do not read much these days.
Additionally, or maybe because of the above, translation was often paid by the line or by the word even before AI. A bit like software security, it is often, sadly, just a check at the end.
IMHO the only future-proof translation fields are legal translation (because a human can be put in prison or made to pay a fine) and live interpretation (because a human can go in front of people, meet them at an event, etc.).
> It is a pity but people do not read much these days.
This is a common belief, but it's just not true. The book industry is healthier than ever.
Study results published quite recently suggest otherwise, at least for the U.S.
While I don't recall the exact numbers, the share of people who say they read books has pretty much crashed over the past two decades, and a larger portion of the population is considered functionally illiterate than has been the case in a long time.
I spoke with some professional translators early on and they were just in denial, even getting upset at the idea that an AI could replace them. I didn't push too much, but I felt bad for them, as they couldn't see what was going to happen to their field. I really think translation must be the field most impacted by AI.
Software engineers are the translators. We, as a metaphorical community, are in denial. Read the hundreds of comments on this post: the AI code is wrong, or this isn't correct, or it's not effective, or some other justification. At the end of the day it's really hard to accept the change.
> Software engineers are the translators.
True. Recently I started feeling that part of what I have been doing is simply translating natural language into programming languages. Coding also involves things like algorithms and data structures, context, and background knowledge, but these can all be handled in natural language. Once the natural-language description is given, what remains is only translation. LLMs have good knowledge of almost anything, and though they are currently weak at inference and derivation, that already makes them good at the two ends of software engineering: context knowledge and translation.
I understand the people who hate LLMs for coding, and I partly share the feeling, because I enjoy typing on a keyboard, editing parts of the code, and reading the characters; I am an Emacs user. If LLMs can do the work, even if they only take over the typing and editing, some of the fun is gone for me.
Think about chess and Go: even though AI can easily beat humans now, people are still playing them. For programming, if one day AI can do 80% of the work, I guess only a few of today's programmers will be able to keep doing it as a job, just as only a few people can play chess or Go for a living.
This is a really great comparison to draw. It made me think that this feeling of going from mastering a craft to working on large-scale systems is probably how someone who was passionate about cars felt when they went from building cars one by one, knowing how the whole machine worked, to having to take a job on an assembly line.
Fortunately, I think anything pertaining to vibe coding/engineering/analytics is still more enjoyable and less grim than working on an assembly line, but the human feelings remain nonetheless.