Comment by headmelted
4 years ago
Exactly this.
If this were a cryptographic hash then it would work as the parent describes; instead it's at best a very fuzzy match on an image, designed to tolerate blurring, flipping, and colour shifting.
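(To illustrate the distinction - this is not Apple's actual NeuralHash, just a generic perceptual hash as a sketch, using the third-party Pillow and imagehash packages and a made-up file name: a cryptographic digest changes completely if a single byte changes, while a perceptual hash of a blurred or re-encoded copy stays within a small Hamming distance of the original, which is exactly what opens the door to near-miss false matches.)

    import hashlib

    from PIL import Image, ImageFilter  # third-party: Pillow
    import imagehash                    # third-party: imagehash

    # Hypothetical file name, purely for illustration.
    original = Image.open("holiday_photo.jpg")
    blurred = original.filter(ImageFilter.GaussianBlur(radius=2))

    # Cryptographic hash: any change to the pixel data gives a completely
    # different digest, so only an exact copy of a known image would match.
    sha_original = hashlib.sha256(original.tobytes()).hexdigest()
    sha_blurred = hashlib.sha256(blurred.tobytes()).hexdigest()
    print(sha_original == sha_blurred)  # False - even a slight blur breaks it

    # Perceptual hash: built so visually similar images land close together.
    phash_original = imagehash.phash(original)
    phash_blurred = imagehash.phash(blurred)
    print(phash_original - phash_blurred)  # small Hamming distance in bits

    # A scanner that flags anything within a distance threshold will also flag
    # images that merely *look* similar to a known one - the false-positive risk.
    THRESHOLD = 10
    print((phash_original - phash_blurred) <= THRESHOLD)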
It's vastly more likely that innocent people will be implicated by fuzzy matches on innocuous photos of their own children in shorts or swimming clothes than that abusers will be caught.
The other thing is that when you have nothing to hide, you won't make any effort to hide it - meaning you'll upload all of your (completely normal) photos to iCloud without thinking about it again.
The monsters making these images know what they're doing is wrong, so they'll likely go to some effort to scramble or further encrypt the data before uploading.
tl;dr: it's far more likely that this dragnet will end up applying only to innocent people than that it will catch predators.
All this said, I'm still in support of Apple taking steps in this direction, but it needs far stronger protections against false positives than this solution provides. A single false accusation by this system, even if later retracted and rectified, would destroy an entire family's life (and could well drive people to suicide).
Look what happened in the Post Office case in the UK as an example of how these things can go wrong - scores of people went to prison for years for crimes they didn't commit because of a simple software bug.
> The monsters making these images know what they're doing is wrong, so they'll likely go to some effort to scramble or further encrypt the data before uploading.
The ones that make national news from big busts do, because the ones that don't get caught much sooner and only make local news: Google and other parties already have automatic CSAM identification online (server side, not client side, AFAIK) and send hits to Homeland Security.