
Comment by 0xy

4 years ago

Imagine things like:

- A kitchen with nobody in frame.

- A couch with nobody in frame.

- Outdoor scenery with nobody in frame.

- A bathroom with nobody in frame.

Is it really hard to believe you might download something like this without knowing where it came from?

I'm not talking about borderline stuff. I'm talking about content that has not even a hint of pornography or illegality.

But why am I downloading such things to begin with? Not only do these sound like very boring photos, but given their provenance, I don't see a realistic pathway for them to end up on my phone.

So why is it in the database, and why do they care?

  • Law enforcement cares because they want to get pinged whenever someone shares imagery of interest to an investigation, irrespective of its legality.

    A person who creates CSAM likely doesn't just create CSAM all the time, right? Those innocuous pictures get lumped together with illegal content and make it into the database.

    The database is a mess, basically. Of course it is. It's gigantic beyond your wildest estimates.

  • Maybe cropped photos, or innocent frames from videos containing abuse? Not sure what GP is referring to.

    • I am reminded of an infamous couch and pool that are notorious for appearing in many adult productions... Possibly stock footage of a production room, or a recurring prop, intended to be subsampled so that multiple or repeated works by the same person or group can be flagged. I recall a person of interest who was arrested after, of all things, posting a completely benign YouTube tutorial video. My thought at the time was that it was likely a prop or environment match within the video, or some such. The method is definitely doable; a sketch follows below. Partitioned out to every consumer device with unflinching acceptance? Yeahhhh.
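      A minimal sketch of what such a prop match could look like, assuming perceptual hashes computed over image tiles. Pillow and imagehash are real libraries, but the grid size, threshold, and function names here are invented for illustration, since actual vendor methods are undisclosed:

      ```python
      # Hypothetical sketch: match a recurring prop or backdrop across frames
      # by perceptually hashing image tiles. Grid size and Hamming-distance
      # threshold are made-up illustration values, not any real system's.
      from PIL import Image
      import imagehash

      GRID = 4        # split each frame into a 4x4 grid of tiles (assumption)
      THRESHOLD = 6   # max Hamming distance to call two tiles a match (assumption)

      def tile_hashes(path):
          """Perceptually hash each tile so a recurring couch or backdrop can
          match even when the rest of the frame is completely different."""
          img = Image.open(path)
          w, h = img.size
          hashes = []
          for row in range(GRID):
              for col in range(GRID):
                  box = (col * w // GRID, row * h // GRID,
                         (col + 1) * w // GRID, (row + 1) * h // GRID)
                  hashes.append(imagehash.phash(img.crop(box)))
          return hashes

      def shares_a_region(path_a, path_b):
          """True if any tile of one image is perceptually close to any tile
          of the other -- e.g. the same prop in two otherwise unrelated shots."""
          return any(ha - hb <= THRESHOLD
                     for ha in tile_hashes(path_a)
                     for hb in tile_hashes(path_b))
      ```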

      Remember, these databases are essentially signature DBs, and there is no guarantee that every hash is a naive match on the entire file, or that all scans performed are fundamentally the same; the toy contrast below illustrates the difference.
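      Here, hashlib is stdlib and imagehash/Pillow are real libraries, but the distance threshold is an arbitrary illustration, not any vendor's value:

      ```python
      # Hypothetical sketch: an exact whole-file signature breaks on any
      # re-encode; a perceptual signature tolerates resizing/recompression
      # and "matches" within a Hamming distance.
      import hashlib

      import imagehash
      from PIL import Image

      def exact_signature(path):
          """SHA-256 of the raw bytes: flips completely if a single byte
          changes, so re-encoding a file defeats a naive whole-file match."""
          with open(path, "rb") as f:
              return hashlib.sha256(f.read()).hexdigest()

      def perceptual_match(path_a, path_b, threshold=8):
          """pHash comparison: two images 'match' if their hashes fall within
          a small Hamming distance, surviving small edits and recompression."""
          ha = imagehash.phash(Image.open(path_a))
          hb = imagehash.phash(Image.open(path_b))
          return (ha - hb) <= threshold
      ```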

      This is why I reject outright the legitimacy of any client-side CSAM scanner. In a closed-source environment it's yet another opaque blob, and therefore an arbitrary-code-execution vector.

      I'm sorry, but in my calculus I'm not willing to buy into that, even for CSAM. It won't stay limited to filesystems. It won't stay limited to hash matching. The secrecy around ways and means implies there's likely dynamism in what they are looking for, and given the permissions and sensors so many phone apps already request, my "not one inch" instincts are sadly firmly engaged, with no signs of letting up.

      I'm totally behind the fight. I'm not an idiot, though, and I know what the road to hell is paved with. Law enforcement and anti-CSAM agencies are cut a lot of slack and enjoy a lot of unquestioning acceptance from the populace. In my book, that warrants more scrutiny and caution, not less. The rash of inconvenient people rather frequently reported as having CSAM found on their hard drives, with no additional context given in the media, suggests the CSAM label is being wielded in a manner that produces a great degree of political convenience.

      Again, more scrutiny, not less.
