Comment by micromacrofoot

2 years ago

Usually it hallucinates when you're asking for information; in this case it's rewriting existing text, so it should be a little safer. When in doubt, check another source, as with everything.

People should doubt everything an LLM outputs; ergo, why use it in the first place if the desired output is objective fact? LLMs hallucinate, that's what they do. When it's wrong, you likely won't notice, but over time your worldview is going to become more and more distorted.

  • > why use it in the first place if the desired output is objective fact

    rewriting facts is like 90% of all writing jobs