Comment by teeth-gnasher

3 months ago

Sure, but I wouldn’t expect deepseek to either. And if any model did, I’d damn sure not bet my life on it not hallucinating. Either way, that’s not heresy.

> I’d damn sure not bet my life on it not hallucinating.

One would think that if you asked it to help you make drugs, you'd want hallucination as an outcome.

  • Very funny.

    But no. Only a very, very small percentage of drug users want hallucinations.

    Hallucinations usually happen when something has gone bad.

    (So a hallucinating LLM giving drug advice might well result in real hallucinations for the user, but also in permanent kidney damage.)