Comment by wongarsu
15 hours ago
If you crawl any sufficiently large public collection of images, you are bound to download some CSAM by accident.
Filtering out images of beaten-up, naked 7-year-olds is certainly something you should do. But if you go by the US legal definition, "any visual depiction of sexually explicit conduct involving a minor (someone under 18 years of age)", you are going to have a really hard time filtering all of that automatically. People don't suddenly look different when they turn 18, and "sexually explicit" is a wide net open to interpretation.
"It's hard" isn't a sufficient reason. Supply chain and infrastructure that makes safe food widely available is hard. We do it anyway because it's the right thing to do.