
Comment by cubefox

2 hours ago

> Also, X seem to disagree with you and admit that CSAM was being generated

That post doesn't contain such an admission; it instead talks about forbidden prompting.

> Also the reason you can’t make it generate those images is because they implemented safeguards since that article was written:

That article links to this post: https://x.com/Safety/status/2011573102485127562 — which contradicts your claim that there were no guardrails before. And as I said, I already tried it a while ago, and Grok refused to create images of naked adults even then.