Comment by tokai

17 hours ago

In what world is generating CSAM a speech issue? It's really doing a disservice to actual free speech issues to frame it as such.

If pictures are speech, then either CSAM is speech, or you have to justify an exception to the general rule.

CSAM is banned speech.

The point of banning real CSAM is to stop the production of it, because the production is inherently harmful. The production of AI- or human-generated CSAM-like images does not inherently require the harm of children, so it's fundamentally a different consideration. That's why some countries, notably Japan, allow the production of hand-drawn material that in the US would be considered CSAM.

  • If libeling real people is a harm to those people, then altering photos of real children is certainly also a harm to those children.

    • I'm strongly against CSAM, but I will say this analogy doesn't quite hold (though the values behind it do).

      Libel must be an assertion that is not true. Photoshopping or AIing someone isn't an assertion of something untrue. It's more the equivalent of saying "What if this is true?", which is perfectly legal.

  • > The point of banning real CSAM is to stop the production of it, because the production is inherently harmful. The production of AI or human generated CSAM-like images does not inherently require the harm of children, so it's fundamentally a different consideration.

    Quite.

    > That's why some countries, notably Japan, allow the production of hand-drawn material that in the US would be considered CSAM.

    Really? By what US definition of CSAM?

    https://rainn.org/get-the-facts-about-csam-child-sexual-abus...

    "Child sexual abuse material (CSAM) is not “child pornography.” It’s evidence of child sexual abuse—and it’s a crime to create, distribute, or possess. "

  • That's not what we are discussing here, especially since a lot of the material here is edits of real pictures.