Comment by WesleyJohnson
4 days ago
LLMs hallucinate and often provide incorrect answers. They're a fabulous tool if you're not necessarily looking for a specific, correct answer. But I'm not sure I would want my kids to use them as a tutor without someone to vet the output.
That's a very good concern to have. Grounding[0] helps a lot with this and will continue to improve. I'll also add that I've had human teachers who were confidently wrong about things.
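To make that concrete, here's a rough sketch of what grounding can look like in practice. retrieve_passages() and call_llm() are made-up placeholders, not any particular API; the key point is that the model is told to answer only from vetted sources and to cite them, which makes wrong answers much easier to catch.

    # Sketch of grounding: the model may only answer from supplied source
    # passages and must cite them, or say it doesn't know.
    # retrieve_passages() and call_llm() are hypothetical placeholders.

    def retrieve_passages(question: str) -> list[str]:
        # Placeholder: in practice this would query a vetted corpus
        # (textbook, curriculum notes, etc.) for relevant snippets.
        return [
            "Photosynthesis converts light energy into chemical energy stored in glucose.",
            "The light-dependent reactions occur in the thylakoid membranes.",
        ]

    def build_grounded_prompt(question: str, passages: list[str]) -> str:
        # Number each passage so the model can cite it by index.
        sources = "\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
        return (
            "Answer using ONLY the numbered sources below. "
            "Cite the source number for each claim. "
            "If the sources do not contain the answer, say \"I don't know.\"\n\n"
            f"Sources:\n{sources}\n\nQuestion: {question}\n"
        )

    def call_llm(prompt: str) -> str:
        # Placeholder for whatever model API is actually in use.
        raise NotImplementedError

    if __name__ == "__main__":
        question = "Where do the light-dependent reactions of photosynthesis happen?"
        print(build_grounded_prompt(question, retrieve_passages(question)))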
[0] https://deepmind.google/discover/blog/facts-grounding-a-new-...
Nice, thanks for sharing. I wasn't aware of this, but certainly anticipated we'd continue to see improvement in this area. And I completely agree about teachers being wrong, usually without realizing it, but not always. :)