Comment by nnurmanov

17 days ago

As you can see above, some people are using a second or even a third LLM to correct LLM outputs; I think it is the way to minimize hallucinations.

> I think it is the way to minimize hallucinations

Or maybe the way to add new hallucinations. Nobody really knows. Just trust us bro, this is groundbreaking disruptive technology.