Comment by wongarsu
5 hours ago
If you crawl any sufficiently large public collection of images, you are bound to download some CSAM by accident.
Filtering out any images of beaten-up naked 7-year-olds is certainly something you should do. But if you go by the US legal definition of "any visual depiction of sexually explicit conduct involving a minor (someone under 18 years of age)," you are going to have a really hard time filtering all of that automatically. People don't suddenly look different when they turn 18, and "sexually explicit" is a wide net open to interpretation.