Comment by EagnaIonat

1 month ago

That whole incident was so misinformed.

CSAM scanning takes place in the cloud with all the major players. It only has hashes for the worst of the worst stuff out there.

What Apple (and others) do is allow the file to be scanned unencrypted on the server.

The feature Apple wanted to add would scan the files on the device and flag anything that gets a match.

The flagged file could then be decrypted on the server and checked by a human. Everything else stayed encrypted so that it could not be looked at.

If you had iCloud disabled it could do nothing.

The intent was to protect data and children, and to reduce the amount of processing done on the server end to analyse everything.
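
As a very rough illustration of that flow, here is a hypothetical sketch only, not Apple's actual implementation (which, per Apple's paper, used a perceptual hash and encrypted safety vouchers rather than the plain SHA-256 and boolean flag shown here):

```python
import hashlib

# Hypothetical blocklist: in the real systems the hashes come from bodies
# such as NCMEC and are perceptual hashes, not plain SHA-256 digests.
KNOWN_BAD_HASHES = {
    "0123456789abcdef" * 4,  # placeholder entry, not a real hash
}

def scan_before_upload(photo: bytes) -> bool:
    """On-device check, run only on photos queued for iCloud upload.

    Only the match result travels with the (still encrypted) photo;
    photos that never match stay unreadable on the server.
    """
    return hashlib.sha256(photo).hexdigest() in KNOWN_BAD_HASHES

print(scan_before_upload(b"some holiday photo bytes"))  # False
```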

Everyone lost their mind, yet it was all clearly laid out in the papers Apple released on it.

Apple sells their products in oppressive regimes which force them to implement region-specific features. E.g. China has its own iCloud, presumably so it can be easily snooped on.

If they were to add this anti-CSAM feature, it is not unreasonable to think that Apple would be forced to add non-CSAM material to the database in these countries, e.g. anything critical of a local dictatorship. Adding the feature would only catch the low-hanging CSAM fruit, at the cost of great privacy harm and probably human lives. If it were going to stop CSAM once and for all, it could possibly be justified, but that's not the case.

  • If China can force Apple to do that stuff, then it can do so regardless of whether or not Apple adds this feature.

  • Apple and others already scan people's pictures/videos for this stuff, so your argument applies just as well to the way things are now.

    Apple's proposal would have meant your data was more protected, as even they would not have been able to decrypt it.

"It only has hashes for the worst of the worst stuff out there." [citation needed]

I know someone whose MS account was permabanned because they had photos of their own kid in the bathtub. I mean, I guess the person could have been lying, but I doubt they would even have been talking about it if the truth was less innocuous.

  • Sure, and they do that because Microsoft's CSAM detection product (which other providers like Google supposedly use) operates by having unencrypted data access to your files in the cloud.

    What Apple wanted to do was perform those operations using homomorphic encryption and threshold key release, so that the data was checked while still encrypted, and only after a certain number of high-likelihood matches would it even become possible to see the encrypted data (a rough sketch of the threshold idea follows below).

    So the optimistic perspective was that it was a solid win against the current state of the industry (cloud accounts storing information unencrypted so that CSAM products can analyze data), while the pessimistic perspective was that your phone was now acting as a snitch on your behavior (slippery slope etc.)
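
    To make the "threshold key release" idea concrete, here is a minimal Shamir-style k-of-n sketch. This is a generic illustration only; Apple's published design wraps the idea in private set intersection and per-photo "safety vouchers", which is considerably more involved than this.

    ```python
    import random

    PRIME = 2**127 - 1  # a Mersenne prime, large enough for a toy demo

    def make_shares(secret, threshold, count):
        """Split `secret` so that any `threshold` shares reconstruct it."""
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
        return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
                for x in range(1, count + 1)]

    def reconstruct(shares):
        """Lagrange interpolation at x = 0 recovers the secret."""
        total = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * -xj % PRIME
                    den = den * (xi - xj) % PRIME
            total = (total + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
        return total

    # Toy scenario: the key needed to open flagged content is split across
    # per-photo vouchers, so the server can only assemble it once it holds
    # `threshold` separate matches.
    account_key = 123456789
    vouchers = make_shares(account_key, threshold=3, count=10)
    assert reconstruct(vouchers[:3]) == account_key  # 3 matches: recoverable
    assert reconstruct(vouchers[:2]) != account_key  # 2 matches: still locked
    ```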

    • > while the pessimistic perspective was that your phone was now acting as a snitch on your behavior

      The actual auditing doesn't happen until the file hits the cloud, though, which is what happens now.

      Thanks for being a voice of reason. I'm still amazed at how many people are still upset about this despite clearly never having read the paper on it.

    • I'm just refuting what the person I responded to said, because apparently these services have hashes for more than just "the worst of the worst stuff" or whatever.

  • > [citation needed]

    It is actually detailed in Apple's paper. Also:

    https://www.interpol.int/en/Crimes/Crimes-against-children/I...

    It works by generating a hash of known materials. Those hashes are shared with other companies so they can find that material without having to see the horrific stuff. The chance of a hash collision was also detailed in the paper and is so low as to be practically non-existent. Even if a collision occurs, a human still reviews the materials, and it normally takes a couple of hits to trigger an audit (again, according to Apple's paper; there's a toy sketch of that hit-counting idea at the end of this comment).

    > I know someone whose MS account was permabanned because they had photos of their own kid in the bathtub

    So you ask me for a citation and then give me anecdotal evidence?

    Even if that happened it has nothing to do with CSAM.
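
    As a toy illustration of the "couple of hits before an audit" part mentioned above (the threshold value and names here are made up, not the ones from Apple's paper):

    ```python
    from collections import defaultdict

    AUDIT_THRESHOLD = 3          # hypothetical value, not Apple's actual one
    match_counts = defaultdict(int)

    def record_match(account_id: str) -> bool:
        """Count blocklist hits per account; nothing is escalated to a
        human reviewer until the count reaches the threshold."""
        match_counts[account_id] += 1
        return match_counts[account_id] >= AUDIT_THRESHOLD

    assert record_match("acct-1") is False  # first hit: nothing happens
    assert record_match("acct-1") is False  # second hit: still nothing
    assert record_match("acct-1") is True   # third hit: human review
    ```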

I can't believe how uninformed, angry, and eager to argue people were over this. The whole point was a very reasonable compromise between a legal requirement to scan photos and keeping photos end-to-end encrypted for the user. You can say the scanning requirement is wrong; there are plenty of arguments for that. But Apple went so far above and beyond to try to keep photo content private and provide E2E encryption while still trying to follow the spirit of the law. No other big tech company even bothers, and somehow Apple is the outrage target.

  • > a legal requirement to scan photos

    Can you provide documentation demonstrating this requirement in the United States? It is widely understood that no such requirement exists.

    There's no need to compromise with any requirement, this was entirely voluntary on Apple's part. That's why people were upset.

    > I can't believe how uninformed

    Oh the irony.

    • Should have said "potential legal requirement". There was a persistent threat of blocking the use of E2E encryption for this exact reason.

    • > Can you provide documentation demonstrating this requirement in the United States?

      The PROTECT Act of 2003 makes CSAM illegal, and it is enforced by the FBI and ICE.

      There is also NCMEC, a non-profit created by the US government that actively works in this area.

  • > a legal requirement to scan photos

    There is absolutely no such legal requirement. If there were one it would constitute an unlawful search.

    The reason the provider scanning is lawful at all is because the provider has inspected material voluntarily handed over to them, and through their own lawful access to the customer material has independently and without the direction of the government discovered what they believe to be unlawful material.

    The cryptographic functionality in Apple's system was not there to protect the user's privacy; it instead protected Apple and its data sources from accountability, by concealing the fingerprints that would cause users' private data to be exposed.

  • There isn’t a law that requires them to proactively scan photos. That is why they could turn the feature back off.

    • A law requiring proactive scanning of photos would in fact make the whole situation worse in the US, because a government-mandated scan would require a warrant. As long as the scanning is voluntary on the company's part and not coerced by the government, they can proactively scan.

> The feature Apple wanted to add would scan the files on the device and flag anything that gets a match.

This is not the revelation you think it is. Critics understood this perfectly.

People simply did not want their devices scanning their content against some opaque uninspectable government-controlled list that might send you to jail in the case of a match.

More generally, people usually want their devices working for their personal interests only, and not some opaque government purpose.

  • From my understanding, it didn't scan all of the files on the device, just the files that were being uploaded to Apple's iCloud. It was set up to scan the photos on the device because the files were encrypted before they were sent to the cloud, so Apple couldn't access the contents, but Apple still wanted to make sure its cloud wasn't storing anything that matched the hashes for known bad content.

    If you never uploaded those files to the cloud, the scanning wouldn't catch any files that are only local.

    • Your understanding is correct, as was/is the understanding of people critical of the feature.

      People simply don't want their device's default state to be "silently working against you, unless you are hyperaware of everything that needs to be disabled". Attacks on this desire were felt particularly strongly due to Apple having no legal requirement to implement that functionality.

      One also can't make the moral argument that the "bad content" list only included CSAM material, as that list was deliberately made opaque. It was a "just trust me bro" situation.

  • > People simply did not want their devices scanning their content against some opaque uninspectable government-controlled list that might send you to jail in the case of a match.

    Again I feel like many people just didn't read/understand the paper.

    As it stands now, all your files/videos are scanned by all the major cloud companies.

    Even if you get a hit on the database, the hash doesn't put you in jail. The illegal material does, and a human reviews it before making a case.

That perceptual-hash technology could have failed in numerous ways, ruining the lives of law-abiding users along the way.

  • The chance of a hash colliding is near 0%. The hashes are for some of the worst content out there; it's not trying to detect anything else.

    Even so a human is in the loop to review what got a hit. Which is exactly what happens now.

    • > The chance of a hash colliding is near 0%.

      The 'chance' is 100% -- collisions and even arbitrary second preimages have been constructed (see the sketch at the end of this comment).

      > The hashes are for some of the worst content out there; it's not trying to detect anything else.

      You don't know that, because Apple developed powerful new cryptographic techniques to protect themselves and their data providers from accountability.
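
      A deliberately tiny average-hash sketch (made-up pixel data; NeuralHash is a neural-network-based perceptual hash, not this) shows why: a perceptual hash is supposed to collide on near-duplicates, which is also why adversarial collisions are far easier to construct than for a cryptographic hash like SHA-256.

      ```python
      def average_hash(pixels):
          """Toy perceptual hash: one bit per pixel, set if above the mean."""
          mean = sum(pixels) / len(pixels)
          return sum(1 << i for i, p in enumerate(pixels) if p > mean)

      img_a = [10, 200, 30, 220, 15, 210, 25, 205, 12]   # "original" image
      img_b = [12, 198, 33, 219, 14, 208, 27, 204, 10]   # lightly edited copy
      assert average_hash(img_a) == average_hash(img_b)  # collides by design
      ```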

    • > The chance of a hash colliding is near 0%

      Until someone finds a successful collision attack.

      > Even so a human is in the loop to review what got a hit.

      Until shareholder/growth pressure causes them to replace that human with an AI.

Yes, this is better than uploading the entire photo. Just as virus scanning can be done entirely on device, can the flagging be kept local? If homomorphic encryption allows similarity matching, it does not seem entirely private. Can people be matched?

> The intent was to protect data and children, and to reduce the amount of processing done on the server end to analyse everything.

If it’s for the children, then giving up our civil liberties is a small price to pay. I’d also like to give up liberties in the name of “terrorism”.

When we willingly give up our rights out of fear, these evil people have won.

  • > If it’s for the children, then giving up our civil liberties is a small price to pay.

    All your pictures and videos are already scanned today. What civil liberty would their approach have changed?