Comment by karel-3d
11 hours ago
It will not go wrong in obvious ways; LLMs are actually not bad at language translation, and the project has large test coverage, so any issues will be non-obvious. The bigger question is long-term maintainability: how fast will the whole thing collapse?