
Comment by darquomiahw

3 months ago

> AI-generated "CSAM" is the perfect form of kompromat. Any computer with a GPU can generate images that an American judge can find unpalatable. Once tarred with the sex-offender brush and left to cool for 20+ years in prison for possession, any individual will be effectively destroyed. Of course, no real children are being abused, which makes it all the more ludicrous.

You could say the same thing about real CSAM. Authorities claim you have it, present a hard drive they "recovered" from your property in court, and you get convicted.

At some level we just have to hope, while remaining skeptical and so on, that the government is basically staffed by mostly trustworthy people.

  • >> Of course, no real children are being abused,

    > You could say the same thing about real CSAM.

    Maybe this is trying to make a technical point.

    But abuse is a brutally heavy topic; mere technicalities are never going to be strong enough to support the weight put upon them.

    That said: I do believe there are counterarguments to be made. I question the ethics and actions of people who leverage the pain of injured children to expand government reach and power.

    Having been a child who could check all the abuse boxes, I find it gets old watching the endless parade of interests trying to mine that pain for their own ends.

    • In the scenario I laid out, no additional real children are being abused either, because the government is simply framing you and you never consumed or produced any CSAM. That's what I meant by it being similar: if your concern with punishing people for AI-generated CSAM is that law enforcement could lie about it and frame innocent people, then that same risk exists with the real thing; they could lie about that too. If you want the government to enforce any laws at all, you need to trust that it is mostly staffed by trustworthy people.