Comment by lenkite
4 months ago
As long as the LLM doesn't hallucinate stuff when translating, by generating text that is inaccurate or even completely fabricated.