Comment by detaro

4 years ago

No, Apple's proposal was to scan only content being uploaded to their servers, and only to match hashes against known material, not to do general "nude child!" detection.

While there are problems with that too, it is a better design than what Google does.

https://appleinsider.com/articles/21/12/15/apples-hold-on-im...

  • As the article states, the scanning only occurred if you had iCloud Photos enabled. From what I recall, it worked like:

    1. Use on-device ML (perceptual hashing against known hashes) to scan for child porn before iCloud Photos upload.

    2. If no match is found, issue a cryptographic ticket for iCloud upload and include it in the upload request.

    …the whole point of Apple’s scheme, as far as I know, was to keep as much of the processing on device as possible while also keeping illegal content off their cloud servers.
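The match-then-ticket flow described above can be sketched in a few lines. This is purely illustrative: the hash set, function names, and "ticket" format are made up here, and Apple's actual design used a neural perceptual hash (NeuralHash) plus private set intersection and threshold secret sharing, not a plain hash lookup.

```python
import hashlib

# Hypothetical database of known-material hashes (placeholder value, not a
# real hash). In Apple's real scheme the client never learns this set; it is
# blinded via private set intersection.
KNOWN_HASHES = {"deadbeef" * 8}

def hash_image(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A cryptographic hash like SHA-256 only
    # matches byte-identical files, which is exactly why Apple proposed
    # NeuralHash (robust to resizing/recompression) instead.
    return hashlib.sha256(image_bytes).hexdigest()

def upload_ticket(image_bytes: bytes):
    """Return an upload 'ticket' if the image matches no known hash, else None."""
    if hash_image(image_bytes) in KNOWN_HASHES:
        return None  # matched known material: no ticket issued
    # Hypothetical ticket: the real design attached a cryptographic "safety
    # voucher" to every upload rather than gating uploads like this.
    return hashlib.sha256(b"ticket:" + image_bytes).hexdigest()
```

The point of the on-device step, as the comment notes, is that the server only ever sees the ticket/voucher, not the raw match decision for each photo.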