chris12321 1 year ago
ChatGPT won't tell you how to do anything illegal; for example, it won't tell you how to make drugs.

teeth-gnasher 1 year ago
Sure, but I wouldn't expect deepseek to either. And if any model did, I'd damn sure not bet my life on it not hallucinating. Either way, that's not heresy.

riskable 1 year ago
> I'd damn sure not bet my life on it not hallucinating.

One would think that if you asked it to help you make drugs, you'd want hallucination as an outcome.

lukan 1 year ago
Very funny. But no. Only a very, very small percentage of drug users want hallucinations. Hallucinations usually happen when something has gone bad. (So a hallucinating LLM giving drug advice might well result in real hallucinations for the user, but also in permanent kidney damage.)