Comment by ComputerGuru
4 years ago
You’re right, they’re not comparable: Apple was going to be scanning for CSAM on your device, without the photos even reaching their servers (by uploading a backup, texting a copy, etc). Google, so far as I can tell and as corroborated by the article, doesn’t do that. Apple and Google both already scan anything uploaded to your Photos account for CSAM and report to authorities.
No, Apple's proposal was for only scanning stuff that was being uploaded to their servers. And it only matched hashes of known material; there was no general "nude child!" detection.
While there are problems with that too, it's a better design than what Google does.
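To make the distinction concrete, here is a toy sketch of hash matching against known material, as opposed to running a classifier. This is deliberately simplified: Apple's actual system used a perceptual hash (NeuralHash) inside a private set intersection protocol, not a plain SHA-1 lookup, and the hash value below is a made-up placeholder.

```python
import hashlib

# Hypothetical database of hashes of known illegal images
# (placeholder value, not a real entry).
KNOWN_HASHES = {
    "3f786850e387550fdab836ed7e6dc881de23001b",
}

def matches_known_material(image_bytes: bytes) -> bool:
    """Return True only if this exact image's hash is in the known set.

    A novel image never matches, because nothing here judges content;
    it is only a set lookup against previously catalogued material.
    """
    digest = hashlib.sha1(image_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

The design consequence: new images can never trigger a match, which is the key difference from classifier-based scanning.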
https://appleinsider.com/articles/21/12/15/apples-hold-on-im...
As the article states, the scanning only occurred if you had iCloud Photos enabled. From what I recall, it worked like:
1. Compute an on-device perceptual hash (NeuralHash) of each photo before iCloud Photos upload and match it against a database of known CSAM hashes, not a general ML content scan.
2. Attach a cryptographic "safety voucher" to every upload; Apple's servers could only decrypt the match results once a threshold number of matching vouchers accumulated.
…the whole point of Apple’s scheme as far as I know was to keep as much of the processing on device as possible while also keeping illegal content off of their cloud servers.
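The threshold part of the scheme above can be sketched roughly like this. It is a heavily simplified illustration: the real design hid individual match results from the server using threshold secret sharing, while this toy version just counts matches in the clear. The `SafetyVoucher` name is my own, and the threshold of 30 comes from Apple's published threat-model review.

```python
from dataclasses import dataclass

# Apple's announced initial review threshold.
THRESHOLD = 30

@dataclass
class SafetyVoucher:
    # In the real design this bit was cryptographically hidden from the
    # server until the threshold was crossed; here it is in the clear.
    matched: bool

def server_can_review(vouchers: list[SafetyVoucher]) -> bool:
    """Server-side gate: human review is possible only once enough
    uploads have matched known hashes."""
    return sum(v.matched for v in vouchers) >= THRESHOLD
```

The point of the threshold was that a handful of false-positive hash collisions on one account would reveal nothing to Apple.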