Comment by Hizonner
6 hours ago
> There's a legally challengeable assertion there: "trained on CSAM images".
"Legally challengable" only in a pretty tenuous sense that's unlikely to ever haven any actual impact.
That'll be something that's recited as a legislative finding. It's not an element of the offense; nobody has to prove that "on this specific occasion the model was trained in this or that way".
It could theoretically have some impact on a challenge to the constitutionality of the law, but only under pretty unlikely circumstances. First you'd have to get past the presumption that the legislature can make any law it likes regardless of whether it's right about the facts (which, in the US, probably means you have to get courts to review the law under strict scrutiny, which they hate to do). Then you have to prove that the factual claim was actually a critical reason for passing the law, and not just a random aside. Then you have to prove that it's actually false, overcoming a presumption that the legislature properly studied the issue. Then maybe it matters.
I may have the exact structure of that a bit wrong, but that's the flavor of how these things play out.
My comment was in response to a portion of the comment above:
> because the machine-learning models utilized by AI have been trained on datasets containing thousands of depictions of known CSAM victims
I'd argue that CSAM imagery falls into two broad categories: actual photographic images of real abuse, and generated images (paintings, drawings, animations, etc.), and that all generated images are more or less equally bad.
There's a peer link in this larger thread ( https://en.wikipedia.org/wiki/Legal_status_of_fictional_porn... ) indicating that at least two US citizens have been charged and sentenced to 20 and 40 years' imprisonment respectively for the possession and distribution of "fictional" child abuse imagery (animated and still Japanese cartoons, anime, etc.).
So, in the wider world, it's a moot point whether these specific images came from training on actual abuse images or not: they depict abuse, and that's apparently legally sufficient in the US. Further, the same depictions could be generated with or without actual abuse images in the training data, and as equivalent images they'd be equally offensive either way.