Comment by EagnaIonat
1 month ago
> [citation needed]
It's actually detailed in Apple's paper. Also:
https://www.interpol.int/en/Crimes/Crimes-against-children/I...
It works by generating hashes of known material. Those hashes are shared with other companies so they can find that material without having to see the horrific stuff themselves. The chance of a hash collision was also covered in the paper and is so low as to be effectively non-existent. Even if a collision occurs, a human still reviews the material, and it normally takes a couple of hits to trigger an audit (again, according to Apple's paper on it).
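To make that flow concrete, here's a rough Python sketch. Everything in it is a stand-in: real deployments use perceptual hashes (PhotoDNA, Apple's NeuralHash) that survive resizing and re-encoding rather than SHA-256, the database entry is fake, and the review threshold is a policy knob, not a number from the paper.

```python
import hashlib
from pathlib import Path

# Stand-in for a database of hashes of known material, distributed by a
# clearinghouse. Real systems use perceptual hashes, not SHA-256.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

# Illustrative threshold: multiple matches are normally required before
# a human audit is triggered. The exact number is a policy parameter.
REVIEW_THRESHOLD = 3

def hash_file(path: Path) -> str:
    """Hex digest of the file's bytes (stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan(paths: list[Path]) -> list[Path]:
    """Collect files whose hashes match the known set; only queue the
    account for human review once the match count crosses the threshold."""
    matches = [p for p in paths if hash_file(p) in KNOWN_HASHES]
    if len(matches) >= REVIEW_THRESHOLD:
        print(f"{len(matches)} matches: queue account for human review")
    return matches
```

The point of the threshold plus human review is that a single collision never directly triggers action against an account.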
> I know someone whose MS account was permabanned because they had photos of their own kid in the bathtub
So you ask me for a citation and then give me anecdotal evidence?
Even if that happened, it has nothing to do with CSAM.