Comment by diggan
1 day ago
> LLMs will tell you 1 or 2 lies for each 20 facts. It's a hard way to learn.
That was my experience growing up in school too, except you got punished one way or another for speaking up and trying to correct the teacher. If I speak up with an LLM, it either explains why what it said is true or corrects itself, zero emotions involved.
> They can't even get their URLs right...
Famously never happens with humans.
You are ignoring the fact that the mistakes or lies are of a different nature.
If you are in class and you incorrectly argue that there is a mistake in an explanation of derivatives or physics, but you are the one in error, your teacher, hopefully, will not say: "Oh, I am sorry, you are absolutely correct. Thank you for your advice."
Yeah, no, of course if I'm wrong I don't expect the teacher to agree with me; what kind of argument is that? I thought it was clear, but the base premise of my previous comment is that the teacher is incorrect and refuses corrections...
My point is a teacher will not do something like this:
- Confident synthesis of incompatible sources: LLM: “Einstein won the 1921 Nobel Prize for his theory of relativity, which he presented at the 1915 Solvay Conference.”
Or
- Fabricated but plausible citations: LLM: "According to Smith et al., 2022, Nature Neuroscience, dolphins recognise themselves in mirrors." There is no such paper... the model invents both the authors and the journal reference.
And this is the danger of coding with LLMs....
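To make that concrete, here is a contrived sketch of the same failure mode in code. The `parse_file` call below is made up purely for illustration; it does not exist in Python's `json` module, which is exactly the point:

```python
import json

# Real API: json.loads parses a JSON string. This line runs fine.
data = json.loads('{"dolphins": "self-aware"}')

# The failure mode: a call that looks just as plausible but does not
# exist. json has no parse_file(); uncommenting this raises
# AttributeError, yet an LLM will state it with the same confidence.
# data = json.parse_file("results.json")
```

Unlike with the teacher, nothing in the output signals which of the two lines is invented; you only find out when it blows up.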