Comment by mrtksn

15 hours ago

CSAM: Child Sexual Abuse Material.

When you undress a child with AI, especially publicly on Twitter or privately through DM, that child is abused using the material the AI generated. Therefore CSAM.

> When you undress a child with AI,

I guess you mean pasting a naked body onto a photo of a child.

> especially publicly on Twitter or privately through DM, that child is abused using the material the AI generated.

In which country is that the law?

Here in the UK, I've never heard of anyone being jailed for doing that, whereas many have been for making actual child sexual abuse material.