
Comment by logicchains

18 hours ago

The point of banning real CSAM is to stop the production of it, because the production is inherently harmful. The production of AI- or human-generated CSAM-like images does not inherently require the harm of children, so it's fundamentally a different consideration. That's why some countries, notably Japan, allow the production of hand-drawn material that in the US would be considered CSAM.

If libeling real people is a harm to those people, then altering photos of real children is certainly also a harm to those children.

  • I'm strongly against CSAM, but I will say this analogy doesn't quite hold (though the values behind it do).

    Libel must be an assertion that is not true. Photoshopping or AI-editing someone isn't an assertion of something untrue. It's more the equivalent of saying "What if this is true?", which is perfectly legal.

    • “ 298 (1) A defamatory libel is matter published, without lawful justification or excuse, that is likely to injure the reputation of any person by exposing him to hatred, contempt or ridicule, or that is designed to insult the person of or concerning whom it is published.

          Mode of expression
      
          (2) A defamatory libel may be expressed directly or by insinuation or irony
      
              (a) in words legibly marked on any substance; or
      
              (b) by any object signifying a defamatory libel otherwise than by words.”
      

      It doesn't have to be an assertion, or even a written statement.


> The point of banning real CSAM is to stop the production of it, because the production is inherently harmful. The production of AI- or human-generated CSAM-like images does not inherently require the harm of children, so it's fundamentally a different consideration.

Quite.

> That's why some countries, notably Japan, allow the production of hand-drawn material that in the US would be considered CSAM.

Really? By what US definition of CSAM?

https://rainn.org/get-the-facts-about-csam-child-sexual-abus...

"Child sexual abuse material (CSAM) is not “child pornography.” It’s evidence of child sexual abuse—and it’s a crime to create, distribute, or possess. "

That's not what we are discussing here, still less when a lot of the material here consists of edits of real pictures.