Inaccurate information exists everywhere. Stack Overflow contains inaccurate, outdated, and incomplete info. Caveat emptor, wherever you are.
LLMs are like a knife: a tool that can hurt you if you misuse it, but one that can save LOTS of time if you use it well.
A knife's function is deterministic. LLMs are not.
They routinely misinterpret the information they've ingested and confidently spit out incorrect statements. Worse, they do so in ways we cannot anticipate.
This isn't comparable to a person. This isn't comparable to human intelligence. This isn't a problem that can be handwaved away by saying "people are sometimes wrong too!"