
Comment by chris12321

3 months ago

> ChatGPT won't tell you how to do anything illegal; for example, it won't tell you how to make drugs.

Sure, but I wouldn’t expect DeepSeek to either. And if any model did, I’d damn sure not bet my life on it not hallucinating. Either way, that’s not heresy.

  • > I’d damn sure not bet my life on it not hallucinating.

    One would think that if you asked it to help you make drugs, you'd want hallucination as an outcome.

    • Very funny.

      But no. Only a very, very small percentage of drug users want hallucinations.

      Hallucinations usually happen when something has gone wrong.

      (So a hallucinating LLM giving drug advice might well result in real hallucinations for the user, but also permanent kidney damage.)