Comment by onetimeusename
4 years ago
I agree, and I would add that people have generated legal images that match the hashes.
So I want to ask: what happens if you have a photo that is falsely identified as a match, and an automated mechanism flags you and reports you to the FBI without your knowledge? Can they access your phone at that point to investigate? Would they come to your office and ask about it? Would that be enough evidence to request a wiretap or a warrant? Would they alert your neighbors? How do you clear your name after that happens?
edit: yes, the hash database is downloaded to the phone, and matches are checked on the phone itself.
Another point: the photos used to generate the fingerprints are, I assume, legal black holes that the public is not allowed to inspect. No one wants to be involved in looking at them, and no one wants to be known as someone who looks at them. It could even be legally risky just to request details of what has been put into the image database.
>I would add that people have generated legal images that match the hashes.
That seems like a realistic attack. Since the hash list is public (it has to be, for client-side scanning), you could likely set your computer to grind out an image of, say, some meme whose hash matches an entry on the list, and then distribute it.
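To make the "grind out a match" idea concrete, here is a toy sketch. The `ahash` below is a hypothetical stand-in average-hash, not the real NeuralHash, and the search is plain random guessing; the point is only that the cost of grinding grows roughly like 2^n with the hash length n.

```python
import random

def ahash(pix):
    """Toy average-hash: one bit per pixel, set when the pixel exceeds the mean."""
    m = sum(pix) / len(pix)
    return tuple(1 if p > m else 0 for p in pix)

def grind(target, nbits, seed=1, limit=2_000_000):
    """Random-search for an input whose toy hash equals `target`.

    Returns (input, number_of_tries), or (None, limit) on failure.
    """
    rng = random.Random(seed)
    for tries in range(1, limit + 1):
        img = [rng.random() for _ in range(nbits)]
        if ahash(img) == target:
            return img, tries
    return None, limit

# Collide with an alternating 12-bit pattern; expect on the order of 2**12 tries.
target = tuple(i % 2 for i in range(12))
img, tries = grind(target, 12)
print(tries, img is not None and ahash(img) == target)
```

Doubling the hash length roughly squares the work for blind search, which is why the hash size matters so much for this kind of attack.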
The NCMEC hash list is private, and adversarial attacks require running gradient descent and being able to generate a hash value for arbitrary input.
At least one of these two things must be true: either Apple is going to upload hashes of every image on your device to someone else's server, or the database of hashes will be available somehow to your device.
Is it possible to home in on a target hash using gradient descent? Can you somehow correlate distance between inputs with distance between hashes?
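The usual trick is to replace the hard thresholds with a smooth surrogate and descend on that. A toy sketch, assuming a hypothetical average-hash rather than the real system: relax each threshold bit to a sigmoid, then minimize the squared error against the target bits until the hard hash collides.

```python
import math, random

N = 16        # toy "image": 16 grayscale values (real images: far more)
STEEP = 4.0   # sigmoid steepness in the differentiable relaxation

def ahash(pix):
    """Hard toy average-hash: bit i set when pixel i exceeds the mean."""
    m = sum(pix) / len(pix)
    return tuple(1 if p > m else 0 for p in pix)

def soft_bits(pix):
    """Differentiable relaxation of ahash: sigmoid instead of a hard threshold."""
    m = sum(pix) / len(pix)
    return [1 / (1 + math.exp(-STEEP * (p - m))) for p in pix]

def loss(pix, target):
    return sum((s - t) ** 2 for s, t in zip(soft_bits(pix), target))

rng = random.Random(0)
target = ahash([rng.random() for _ in range(N)])  # hash we want to collide with
img = [rng.random() for _ in range(N)]            # start from an unrelated input

lr, eps = 0.2, 1e-4
for step in range(5000):
    if ahash(img) == target:
        break                                     # hard hash now collides
    base = loss(img, target)
    grad = []
    for i in range(N):                            # finite-difference gradient
        img[i] += eps
        grad.append((loss(img, target) - base) / eps)
        img[i] -= eps
    img = [p - lr * g for p, g in zip(img, grad)]

print(ahash(img) == target)
```

This converges in far fewer evaluations than blind search because the smooth loss gives a direction to move in, which is exactly what "correlating input distance with hash distance" buys you. Against a neural-network hash you would backpropagate through the network instead of using finite differences.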
Might be hard if they use a huge hash.
One thing to note is that these are not typical cryptographic hashes: they have to match recompressed, cropped, or otherwise edited versions of an image as well. Perhaps "hash" is not an accurate way to describe them.
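A minimal illustration of that difference, again using a hypothetical average-hash rather than the real system: a small brightness shift leaves the perceptual hash unchanged, while a cryptographic hash of the same bytes changes completely.

```python
import hashlib

def ahash(pix):
    """Toy average-hash: one bit per pixel, set when the pixel exceeds the mean."""
    m = sum(pix) / len(pix)
    return "".join("1" if p > m else "0" for p in pix)

img = [10, 200, 30, 180, 50, 220, 40, 190]   # toy 8-pixel image
edited = [p + 5 for p in img]                # uniformly brightened copy

# The perceptual hash survives the edit (relative order of pixels is unchanged)...
print(ahash(img), ahash(edited))             # both print 01010101

# ...but a cryptographic hash of the raw bytes does not:
print(hashlib.sha256(bytes(img)).hexdigest()[:12])
print(hashlib.sha256(bytes(edited)).hexdigest()[:12])
```

That robustness is the whole point of a perceptual hash, and it is also what opens the door to the adversarial attacks discussed above: similar inputs are *supposed* to map to the same output.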
There have been a number of cases where people have found ways to trick CV programs into seeing something that no human would ever see. If you were sufficiently malicious, I imagine it would be possible to do the same with this system.