Comment by SilverRed
4 years ago
Since this is using a db of known images, I doubt that would be an issue. I believe the idea here is that once police raid an illegal site, they collect all of the images into a db and then want a list of every person who had those images saved.
But it said they use a "perceptual hash" - so it's not just looking for 1:1, byte-for-byte copies of specific photos, it's doing some kind of fuzzy matching.
This has me pretty worried - once someone has been tarred with this particular brush, it sticks.
You can’t do a byte-for-byte hash on images, because a slight resize or minor edit will completely change the hash without really modifying the image in a meaningful way.
Image hashes, by contrast, are “perceptual” in the sense that the hash changes in proportion to how much the image changes, so visually similar images produce similar hashes. This is how reverse image search works, and why it works so well.
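To make that concrete, here's a minimal sketch of one simple perceptual hash (a "difference hash") in Python, assuming Pillow is installed. The file names are placeholders, and this is far simpler than whatever Apple actually deploys, but it shows the idea of matching by closeness rather than exact equality:

```python
import hashlib
from PIL import Image  # assumes Pillow is installed


def dhash(path, hash_size=8):
    """Difference hash: compare adjacent pixel brightness on a tiny greyscale thumbnail."""
    # Shrink to (hash_size+1) x hash_size so each row yields hash_size left/right comparisons.
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = []
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits.append(1 if left > right else 0)
    return int("".join(map(str, bits)), 2)


def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# Placeholder files: the same photo before and after a resize/re-encode.
h1 = dhash("original.jpg")
h2 = dhash("resized_copy.jpg")
print(hamming(h1, h2))  # small distance => the matcher treats them as the same image

# A cryptographic hash, by contrast, is completely different after any byte changes.
print(hashlib.sha256(open("original.jpg", "rb").read()).hexdigest())
print(hashlib.sha256(open("resized_copy.jpg", "rb").read()).hexdigest())
```

Two copies that differ only by a resize or re-encode end up a few bits apart in perceptual-hash space, while their SHA-256 digests are unrelated. That's exactly why the matching has to be fuzzy, and why there's a distance threshold, and therefore some false-positive rate, baked into the approach.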
Sure, I get how it works, but I feel like false positives are inevitable with this approach. That wouldn't necessarily be an issue under normal police circumstances where they have a warrant and a real person reviews things, but it feels really dangerous here. As I mentioned, any accusations along these lines have a habit of sticking, regardless of reality - indeed, irrational FUD around the Big Three (terrorism, paedophilia and organised crime) is the only reason Apple are getting a pass for this.