It's also a massive problem on Meta. Hopefully this action isn't just a one-off.
I am not a fan of Grok, but there has been zero evidence of it creating CSAM. For why, see https://www.iwf.org.uk/about-us/
CSAM does not have a universal definition. In Sweden, for instance, CSAM is any image of an underage subject (real or realistic digital) designed to evoke a sexual response. If you take a picture of a 14-year-old girl (the age of consent is 15) and use Grok to put her in a bikini, or make her topless, then you are most definitely producing and possessing CSAM.
No abuse of a real minor is needed.
As good as Australia's little boobie laws.
> CSAM does not have a universal definition.
Strange that there was no disagreement before "AI", right? Yet now we have a clutch of new "definitions" all of which dilute and weaken the meaning.
> In Sweden for instance, CSAM is any image of an underage subject (real or realistic digital) designed to evoke a sexual response.
No corroboration found on the web. Quite the contrary, in fact:
"Sweden does not have a legislative definition of child sexual abuse material (CSAM)"
https://rm.coe.int/factsheet-sweden-the-protection-of-childr...
> If you take a picture of a 14-year-old girl (the age of consent is 15) and use Grok to put her in a bikini, or make her topless, then you are most definitely producing and possessing CSAM.
> No abuse of a real minor is needed.
Even the Google "AI" knows better than that. CSAM "is considered a record of a crime, emphasizing that its existence represents the abuse of a child."
Putting a bikini on a photo of a child may be distasteful abuse of a photo, but it is not abuse of a child - in any current law.
Are you implying that it's not abuse to "undress" a child using AI?
You should realize that children have committed suicide before because AI deepfakes of themselves have been spread around schools. Just because these images are "fake" doesn't mean they're not abuse, and that there aren't real victims.
> Are you implying that it's not abuse to "undress" a child using AI?
Not at all. I am just saying it is not CSAM.
> You should realize that children have committed suicide before because AI deepfakes of themselves have been spread around schools.
It's terrible. And when "AI"s are found spreading deepfakes around schools, do let us know.
It doesn't mention Grok?
Sure does. Twice. E.g.
Musk's social media platform has recently been subject to intense scrutiny over sexualised images generated and edited on the site using its AI tool Grok.