Comment by amatecha
1 month ago
"It only has hashes for the worst of the worst stuff out there." [citation needed]
I know someone whose MS account was permabanned because they had photos of their own kid in the bathtub. I mean, I guess the person could have been lying, but I doubt they would even have been talking about it if the truth was less innocuous.
Sure, and they do that because Microsoft's CSAM detection product, PhotoDNA (which other providers like Google supposedly use), relies on having unencrypted access to your files in the cloud.
What Apple wanted to do was perform those checks using homomorphic encryption and threshold key release, so that the data could be matched while still encrypted, and only after a certain number of high-likelihood matches would it even become possible to see the flagged data.
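The "threshold key release" part is the easiest piece to sketch: each flagged item contributes one share of a key, and nothing becomes decryptable until enough shares exist. Here is a toy Shamir-style secret sharing example in Python; the threshold value, field size, and names are all illustrative, not Apple's actual protocol (which layers this on top of the encrypted matching step).

```python
# Toy sketch of threshold key release: each flagged upload hands the server one
# share of a key, and the key can only be reconstructed once `threshold` shares
# exist. Plain Shamir secret sharing over a prime field; values are illustrative.
import random

PRIME = 2**127 - 1  # a Mersenne prime, big enough for a toy 128-bit secret


def make_shares(secret, threshold, count):
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]

    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME

    return [(x, poly(x)) for x in range(1, count + 1)]


def recover(shares):
    """Lagrange interpolation at x=0 to reconstruct the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret


if __name__ == "__main__":
    key = random.randrange(PRIME)                       # stands in for the account's decryption key
    shares = make_shares(key, threshold=30, count=100)  # one share per flagged item
    print(recover(shares[:29]) == key)                  # False: below the threshold, the key stays hidden
    print(recover(shares[:30]) == key)                  # True: threshold met, key becomes recoverable
```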
So the optimistic perspective was that it was a solid win over the current state of the industry (cloud accounts storing information unencrypted so that CSAM products can analyze it), while the pessimistic perspective was that your phone was now acting as a snitch on your behavior (slippery slope, etc.).
> while the pessimistic perspective was that your phone was now acting as a snitch on your behavior
The actual auditing doesn't happen until the file hits the cloud, though, which is already what happens today.
Thanks for being a voice of reason. I'm still amazed at how many people are upset about this despite clearly never having read the paper on it.
I'm just refuting what the person I responded to said, because apparently these services have hashes for more than just "the worst of the worst stuff" or whatever.
> because apparently these services have hashes for more than just "the worst of the worst stuff" or whatever.
Do you have a citation for that as well? I linked to what CSAM is. Where are you getting your information from?
> [citation needed]
It is actually detailed in Apple's paper. Also:
https://www.interpol.int/en/Crimes/Crimes-against-children/I...
It works by generating a hash of each piece of known material. Those hashes are shared with other companies so they can find that material without having to look at the horrific stuff themselves. The chance of a hash collision is also detailed in the paper and is so low as to be effectively non-existent. Even if a collision occurs, a human still reviews the material, and it normally takes a couple of hits to trigger an audit (again, according to Apple's paper).
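In rough terms the flow looks like the sketch below (the hash database, the sha256 stand-in, and the review threshold are all made up for illustration; real deployments use perceptual hashes in the PhotoDNA/NeuralHash style so near-duplicates still match):

```python
# Toy sketch of the hash-matching flow described above: every upload is hashed,
# compared against a shared database of hashes of known material, and a human
# review is only queued after several matches accumulate on one account.
import hashlib
from collections import defaultdict

KNOWN_HASHES = set()          # hypothetical shared hash database
REVIEW_THRESHOLD = 3          # hypothetical number of hits before human review
match_counts = defaultdict(int)


def check_upload(account_id, file_bytes):
    # sha256 is only a stand-in; real systems use perceptual hashes.
    digest = hashlib.sha256(file_bytes).hexdigest()
    if digest not in KNOWN_HASHES:
        return "clean"
    match_counts[account_id] += 1
    if match_counts[account_id] >= REVIEW_THRESHOLD:
        return "queue_for_human_review"  # a reviewer looks before any enforcement
    return "match_recorded"


if __name__ == "__main__":
    KNOWN_HASHES.add(hashlib.sha256(b"known-bad-sample").hexdigest())
    for _ in range(3):
        print(check_upload("acct-1", b"known-bad-sample"))
    # clean uploads never increment the counter:
    print(check_upload("acct-1", b"family-photo"))
```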
> I know someone whose MS account was permabanned because they had photos of their own kid in the bathtub
So you ask me for a citation and then give me anecdotal evidence?
Even if that happened, it has nothing to do with CSAM.