Comment by Shank
4 years ago
I was under the impression that one of the reasons these tools aren’t available for public download is that the hashes and the system itself could be used to design defeat mechanisms? Doesn’t this mean that someone with an image and a jailbroken device can just watch the system, identify how the photo is detected, and modify it so that it doesn’t trip the filter?
PhotoDNA and systems like it are really interesting, but client-side scanning seems like a dangerous decision, and not just from a privacy perspective. Putting a CSAM detector and its hash list on people’s devices is a really risky idea, even if the system is perfect and does exactly what it claims without violating privacy.
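To make the evasion part concrete, here’s a rough sketch using the open-source imagehash library as a stand-in (PhotoDNA and NeuralHash aren’t public, and the threshold here is made up): a perceptual hash tolerates small edits by design, so anyone who can run the matcher locally can keep nudging an image until its hash drifts past the match threshold.

    # Rough sketch of the evasion concern. imagehash.phash stands in for
    # PhotoDNA/NeuralHash (neither is public); the threshold is hypothetical.
    from PIL import Image, ImageEnhance
    import imagehash

    MATCH_THRESHOLD = 10   # hypothetical: "within 10 bits of a listed hash" counts as a match

    def evade(path, out_path):
        original = Image.open(path).convert("RGB")
        flagged_hash = imagehash.phash(original)   # the fingerprint the filter would match on

        for step in range(1, 50):
            # mild, mostly imperceptible edits: slight brightness shift plus a tiny rotation
            candidate = ImageEnhance.Brightness(original).enhance(1.0 + 0.01 * step)
            candidate = candidate.rotate(0.2 * step)

            distance = imagehash.phash(candidate) - flagged_hash   # Hamming distance in bits
            if distance > MATCH_THRESHOLD:
                candidate.save(out_path)
                return step, distance
        return None

    # evade("flagged.jpg", "evaded.jpg")   # hypothetical filenames

Whether this particular loop would beat the real system is unknowable from the outside, but it’s the shape of attack that local access to the matcher enables.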
I see it as a huge risk too.
If the algorithm and the blocklists leaked, it would be possible not only to develop tools that reliably modify CSAM to avoid detection, but also to generate new innocent-looking images that are caught by the filter. That could be used to overwhelm law enforcement with false positives, and it could be weaponized for swatting.
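To illustrate that second half, here’s a toy sketch. The real NeuralHash model isn’t public, so a small random CNN stands in for it; the point is only that once a differentiable perceptual hash and its blocklist are in an attacker’s hands, plain gradient descent on the pixels of a benign image can push its hash toward a listed one while keeping the change small.

    # Toy illustration of the "innocent image that collides" concern. A small
    # random CNN stands in for the (non-public) NeuralHash model; sign() of its
    # 64 output logits plays the role of the 64-bit perceptual hash.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    class ToyHash(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
                nn.AdaptiveAvgPool2d(4), nn.Flatten(),
                nn.Linear(32 * 16, 64),
            )

        def forward(self, x):
            return self.net(x)

    hasher = ToyHash().eval()

    def bits(logits):
        return (logits > 0).int()

    # target hash from the leaked list (here: just the hash of a random image)
    with torch.no_grad():
        target_bits = bits(hasher(torch.rand(1, 3, 64, 64)))

    innocent = torch.rand(1, 3, 64, 64)            # stand-in for a benign photo
    perturbation = torch.zeros_like(innocent, requires_grad=True)
    opt = torch.optim.Adam([perturbation], lr=0.01)

    for step in range(500):
        adv = (innocent + perturbation).clamp(0, 1)
        logits = hasher(adv)
        sign = target_bits.float() * 2 - 1          # +1/-1 pattern demanded by the target hash
        # hinge loss pushes each logit to the target sign; L1 term keeps the edit small
        loss = torch.relu(0.1 - sign * logits).mean() + 10 * perturbation.abs().mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

    with torch.no_grad():
        final = (innocent + perturbation).clamp(0, 1)
        matches = (bits(hasher(final)) == target_bits).sum().item()
    print(f"{matches}/64 target hash bits matched; "
          f"mean |perturbation| = {perturbation.abs().mean().item():.4f}")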
Fortunately, it seems that matching is split between the client and the server, so extracting the database from the device won’t easily enable generating matching images.
https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...
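That document describes an elliptic-curve PSI protocol with a blinded hash table and threshold secret sharing. Here’s a heavily simplified sketch (Diffie-Hellman-style blinding over a plain prime field, illustration only, nothing like production parameters) of roughly why shipping only the blinded database to devices means the device can neither read the hash list nor decide locally whether one of its photos matched:

    # Simplified sketch of the blinded-hash idea from Apple's technical summary.
    # The real system uses elliptic-curve PSI, a cuckoo table and threshold
    # secret sharing; plain DH-style blinding over a prime field stands in here.
    import hashlib
    import secrets

    P = 2**127 - 1          # prime modulus (illustration only, not a secure choice)
    G = 7                   # generator

    def hash_to_group(image_hash: bytes) -> int:
        e = int.from_bytes(hashlib.sha256(image_hash).digest(), "big")
        return pow(G, e, P)

    # --- server: blind every hash in the CSAM list with a secret exponent ---
    server_secret = secrets.randbelow(P - 2) + 1
    csam_hashes = [b"hash-of-known-image-1", b"hash-of-known-image-2"]   # placeholders
    blinded_db = {h: pow(hash_to_group(h), server_secret, P) for h in csam_hashes}
    # Only the blinded values (indexed by hash prefix in the real system) ship to
    # devices; without server_secret they can't be mapped back to the hashes.

    # --- device: build a voucher for one photo against the table entry it looked up ---
    def device_voucher(photo_hash: bytes, blinded_entry: int):
        r = secrets.randbelow(P - 2) + 1
        announced = pow(hash_to_group(photo_hash), r, P)    # sent to the server
        key = hashlib.sha256(pow(blinded_entry, r, P).to_bytes(16, "big")).digest()
        return announced, key   # the voucher payload would be encrypted under `key`

    # --- server: can re-derive the key only if the photo really matched ---
    def server_key(announced: int) -> bytes:
        return hashlib.sha256(pow(announced, server_secret, P).to_bytes(16, "big")).digest()

    # matching photo: keys agree, so the server could decrypt the voucher
    announced, dev_key = device_voucher(b"hash-of-known-image-1",
                                        blinded_db[b"hash-of-known-image-1"])
    assert server_key(announced) == dev_key

    # non-matching photo looked up against the same slot: keys disagree
    announced, dev_key = device_voucher(b"hash-of-unrelated-photo",
                                        blinded_db[b"hash-of-known-image-1"])
    assert server_key(announced) != dev_key

The real protocol layers more on top (the cuckoo table, and a threshold scheme so nothing is decryptable until enough matches accumulate); the sketch only shows the blinding step that keeps the on-device database from doubling as a usable hash list.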