Comment by aethertron
4 years ago
Maybe cropped photos, or innocent frames from videos containing abuse? Not sure what GP is referring to.
I am reminded of an infamous couch and pool notorious for appearing in many adult productions... Possibly stock footage of a production room, or a recurring prop intended to be subsampled so that multiple or repeated works by the same person or group can be flagged. I recall a person of interest being arrested after, of all things, posting a completely benign YouTube tutorial video. My thought at the time was that it was likely a prop match against the environment or something similar in the video. The method is definitely doable. Pushed out to every consumer device with unflinching acceptance? Yeahhhh.
Remember, these databases are essentially signature DBs, and there is no guarantee that every hash is a naive match on the entire file, or that all scans performed are fundamentally the same.
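As a toy illustration of that point (everything here is made up: the 8x8 "image" is just a flat list of pixel values, and the average hash is vastly cruder than anything like PhotoDNA or NeuralHash), here is the difference between a naive whole-file signature and a perceptual one, in Python:

    import hashlib

    def exact_signature(data: bytes) -> str:
        # Naive whole-file match: any single-bit change yields a new hash.
        return hashlib.sha256(data).hexdigest()

    def average_hash(pixels: list[int]) -> int:
        # Toy perceptual hash: one bit per pixel, set if brighter than the
        # mean. Small edits (crops, re-encodes) leave most bits unchanged.
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming_distance(a: int, b: int) -> int:
        return bin(a ^ b).count("1")

    original = [10, 200, 30, 180, 90, 220, 15, 170] * 8  # pretend 8x8 image
    tweaked  = [12, 198, 33, 181, 88, 219, 14, 172] * 8  # slight re-encode

    # Exact signatures no longer match after the tiniest change...
    print(exact_signature(bytes(original)) == exact_signature(bytes(tweaked)))  # False

    # ...while the perceptual hashes stay within a small distance threshold.
    d = hamming_distance(average_hash(original), average_hash(tweaked))
    print(d <= 5)  # True here: close enough to be flagged as a "match"

The exact signature breaks on a single-bit change; the perceptual one tolerates small edits. That's why "it matched the hash DB" can mean very different things depending on the scheme.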
This is why I reject outright the legitimacy of any client-side CSAM scanner. In a closed-source environment it's yet another blob, and therefore an arbitrary code execution vector.
I'm sorry, but in my calculus I'm not willing to buy into that, even for CSAM. It won't stay just filesystems. It won't stay just hash matching. The sheer secrecy around ways and means implies there is likely some dynamism in what they are looking for, and given the permissions and sensors that many apps already request on a phone, my not-one-inch instincts are sadly and firmly engaged, with no signs of letting up.
I'm totally behind the fight. I'm not an idiot, though, and I know what the road to hell is paved with. Law enforcement and anti-CSAM agencies are cut a lot of slack and enjoy a lot of unquestioning acceptance from the populace. In my book, that warrants more scrutiny and caution, not less. The rash of inconvenient people being rather frequently called out for having CSAM found on their hard drives, reported with no additional context, suggests the CSAM label is being wielded in a manner that produces a great degree of political convenience.
Again, more scrutiny, not less.
Who knows what that code blob is really doing? It's cop spyware. Sometimes cops plant evidence. Maybe courts shouldn't trust it.
In principle, the OS environment could be made independently auditable, e.g. by keeping undeletable, signed logs.
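A minimal sketch of that idea, assuming a hash-chained log where each entry is bound to its predecessor and "signed" with an HMAC (DEVICE_KEY and the entry format are hypothetical stand-ins; a real design would use asymmetric signatures anchored in secure hardware, plus storage the OS itself cannot rewrite):

    import hashlib, hmac, json, time

    DEVICE_KEY = b"stand-in for a key sealed in secure hardware"

    def append_entry(log: list[dict], event: str) -> None:
        prev = log[-1]["sig"] if log else "genesis"
        body = {"ts": time.time(), "event": event, "prev": prev}
        payload = json.dumps(body, sort_keys=True).encode()
        body["sig"] = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
        log.append(body)

    def verify_chain(log: list[dict]) -> bool:
        prev = "genesis"
        for entry in log:
            body = {k: v for k, v in entry.items() if k != "sig"}
            if body["prev"] != prev:
                return False  # an entry was deleted or reordered
            payload = json.dumps(body, sort_keys=True).encode()
            expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
            if not hmac.compare_digest(expected, entry["sig"]):
                return False  # an entry was altered after the fact
            prev = entry["sig"]
        return True

    audit_log: list[dict] = []
    append_entry(audit_log, "scanner read /photos/IMG_0042.jpg")
    append_entry(audit_log, "scanner uploaded match voucher")
    print(verify_chain(audit_log))  # True

    audit_log.pop(0)                # quietly delete the first entry
    print(verify_chain(audit_log))  # False: the chain no longer links up

Deleting, reordering, or editing any entry breaks the chain, so after-the-fact tampering is at least detectable, even if it can't be prevented.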