Comment by moolcool
17 hours ago
Are you implying that it's not abuse to "undress" a child using AI?
You should realize that children have committed suicide because AI deepfakes of them have been spread around their schools. Just because these images are "fake" doesn't mean they aren't abuse, or that there aren't real victims.
> Are you implying that it's not abuse to "undress" a child using AI?
Not at all. I am just saying it is not CSAM.
> You should realize that children have committed suicide before because AI deepfakes of themselves have been spread around schools.
It's terrible. And when "AI"s are found spreading deepfakes around schools, do let us know.
Why do you keep insisting that undressing children is not CSAM? It's a weird hill to die on.
CSAM: Child Sexual Abuse Material.
When you undress a child with AI, especially publicly on Twitter or privately through DM, that child is abused using the material the AI generated. Therefore CSAM.
> When you undress a child with AI,
I guess you mean pasting a naked body on a photo of a child.
> especially publicly on Twitter or privately through DM, that child is abused using the material the AI generated.
In which country is that?
Here in the UK, I've never heard of anyone jailed for doing that, whereas many have been for making actual child sexual abuse material.