
Comment by mrshadowgoose

1 month ago

> The feature Apple wanted to add would scan the files on the device and flag anything that got a match.

This is not the revelation you think it is. Critics understood this perfectly.

People simply did not want their devices scanning their content against some opaque, uninspectable, government-controlled list that might send you to jail in the event of a match.

More generally, people want their devices working solely in their own interest, not serving some opaque government purpose.

From my understanding, it didn't scan all of the files on the device, just the photos queued for upload to Apple's iCloud. The matching ran on the device because the files were encrypted before being sent to the cloud: Apple couldn't access the contents server-side, but still wanted to make sure its cloud wasn't storing anything that matched known hashes of bad content.

If you never uploaded those files to the cloud, the scanning never touched them; purely local files were out of scope.
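For concreteness, here's a minimal sketch of that pre-upload flow. Everything in it is a stand-in: `perceptual_hash` uses SHA-256 where Apple proposed NeuralHash, the hash database and "voucher" are simplified, and none of these names are real Apple APIs.

```python
import hashlib

def perceptual_hash(photo_bytes: bytes) -> str:
    # Stand-in for a perceptual hash like NeuralHash. A real perceptual
    # hash tolerates resizing/recompression; SHA-256 does not.
    return hashlib.sha256(photo_bytes).hexdigest()

def encrypt(data: bytes) -> bytes:
    # Placeholder only; the real design used per-photo encryption keys.
    return data[::-1]

# Hypothetical hash database shipped to the device
# (opaque to the user; its contents were not inspectable).
KNOWN_BAD_HASHES: set[str] = set()

def prepare_for_icloud_upload(photo_bytes: bytes) -> dict:
    """Runs only for photos queued for iCloud upload; purely local
    photos never reach this code path."""
    digest = perceptual_hash(photo_bytes)
    return {
        "ciphertext": encrypt(photo_bytes),
        # Simplified "safety voucher": the match result travels with the
        # upload rather than being acted on locally. In Apple's actual
        # design the device could not even learn this boolean (it was
        # blinded via private set intersection).
        "matched": digest in KNOWN_BAD_HASHES,
    }
```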

  • Your understanding is correct, as was/is the understanding of people critical of the feature.

    People simply don't want their device's default state to be "silently working against you unless you are hyperaware of everything that needs to be disabled". This cut particularly deep because Apple was under no legal requirement to implement the functionality in the first place.

    One also can't make the moral argument that the "bad content" list included only CSAM, as that list was deliberately kept opaque. It was a "just trust me bro" situation.

    • > People simply don't want their device's default state to be "silently working against you

      That was the misconception about what was happening, though.

      Nothing happens on your device; action is taken only once the file reaches the cloud. The device just attaches a flag to the picture in question so the cloud can scan it.

      Which is exactly what happened before Apple suggested this, and what happens now, except applied to all your files.

      > One also can't make the moral argument that the "bad content" list only included CSAM material, as that list was deliberately made opaque. It was a "just trust me bro" situation.

      The CSAM hash database Apple proposed matching against is maintained by NCMEC, the National Center for Missing & Exploited Children. What evidence do you have that they are not being honest?


> People simply did not want their devices scanning their content against some opaque, uninspectable, government-controlled list that might send you to jail in the event of a match.

Again, I feel like many people just didn't read or understand the paper.

As it stands now, all your files and videos are already scanned by every major cloud provider.

Even a hit against the database doesn't put you in jail by itself. The illegal material does, and a human reviews it before a case is made.
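A hedged sketch of the server-side flow being described: a hash hit is only a signal that routes the file to human review, not an automatic accusation. The names, the SHA-256 choice, and the review step are illustrative, not any provider's actual pipeline.

```python
import hashlib

# Illustrative database of known-bad file hashes; not any provider's real list.
KNOWN_HASHES: set[str] = set()

def queue_for_human_review(digest: str) -> None:
    # Placeholder action; a real pipeline would open a review ticket.
    print(f"flagged for manual review: {digest}")

def scan_uploaded_file(file_bytes: bytes) -> None:
    digest = hashlib.sha256(file_bytes).hexdigest()
    if digest in KNOWN_HASHES:
        # A hash hit is only a signal: the file is routed to a human
        # reviewer, who confirms the content before any report is filed.
        queue_for_human_review(digest)
```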