Comment by defrost
10 hours ago
> I think you would be hard-pressed to prove that any model trained on internet scrapes definitively wasn't trained on any CSAM whatsoever.
I'd be hard pressed to prove that you definitely hadn't killed anybody ever.
Legally, if it's asserted that these images are criminal because they are the product of an LLM trained on sources that contained CSAM, then the requirement would be to prove that assertion.
With text and speech you could prompt the model to exactly reproduce a Sarah Silverman monologue and assert that proves her content was used in the training set, etc.
Here the defense would ask the prosecution to demonstrate how to extract a copy of original CSAM.
But your point is well taken, it's likely most image generation programs of this nature have been fed at least one image that was borderline jailbait and likely at least one that was well below the line.
Framing it in that way is essentially a get out of jail free card - anyone caught with CSAM can claim it was AI generated by a "clean" model, and how would the prosecution ever be able to prove that it wasn't?
I get where you are coming from but it doesn't seem actionable in any way that doesn't effectively legalize CSAM possession, so I think courts will have no choice but to put the burden of proof on the accused. If you play with fire then you'd better have the receipts.
This seems like a long way of saying “guilty until proven innocent”.
It is, but that's hardly unprecedented for this type of thing. In commercial porn the legal burden is on the producer to prove that their models were adults, not for prosecutors to prove that they were minors.
> Legally if it's asserted that these images are criminal because they are the result of being the product of an LLM trained on sources that contained CSAM then the requirement would be to prove that assertion.
Legally, possession of CSAM is against the law because there is an assumption that possession proves contribution to market demand, with an understanding that demand incentivizes production of supply, meaning that with demand, children will be harmed again to produce more content to satisfy that demand. In other words, the intent is to stop future harm. This is why people have been prosecuted for things like suggestive cartoons that have no real-life events behind them. It is not illegal on the grounds of past events; the actual abuse is illegal on its own standing.
The provenance of the imagery is irrelevant. What you would need to prove is that your desire to have such imagery won't stimulate you or others to create new content with real people. If you could somehow prove that LLM-generated content would satisfy all future demand, problem solved! That would be world changing.
I'm somewhat sympathetic to that argument. However, it doesn't stop there.
By the same logic, violent video games prove contribution to market demand for FPS-style depictions of mass shootings or carjackings, so can/should we ban Call of Duty and Grand Theft Auto now?
(Note that the "market demand" argument is subtly different from the argument that the games directly cause people to become more violent, either in general or by encouraging specific copycat violence. Studies on [lack of] direct violence causation are weak and disputed.)
Tell us more about the market that is carrying out mass shootings/carjackings with intent to sell the imagery. This is the first I've heard of it. In fact, when mass shootings do occur, they are usually explained by some other cause (e.g. a troubled teen who went off the rails), not as the day-to-day operation of a videography business.