Comment by lenkite
9 days ago
As long as the LLM doesn't hallucinate when translating, i.e. generate text that is inaccurate or even completely fabricated.