Comment by srj

4 years ago

I don't have knowledge of how Apple is using this, but based on what I know about how it's used at Google, this would be flagging previously reviewed images. That wouldn't include your family photos; these are generally hash-type matches against known images circulating online. The images would need to depict actual abuse of a child to be CSAM.
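
To make the "hash-type match" idea concrete, here's a minimal sketch of matching against a database of previously reviewed images. This is not Apple's or Google's actual pipeline: the set name KNOWN_HASHES is a made-up placeholder, and I'm using SHA-256 for simplicity, whereas real systems use perceptual hashes (e.g. PhotoDNA or NeuralHash) that still match after resizing or re-encoding.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of previously reviewed images.
# In practice this would be a large set distributed by a clearinghouse,
# and the hashes would be perceptual, not cryptographic, so re-encoded
# copies of the same image still match.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's bytes."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def is_previously_reviewed(path: Path) -> bool:
    # Only copies of an already-reviewed image can match; a never-seen
    # photo (e.g. a family picture) produces a hash not in the set.
    return file_hash(path) in KNOWN_HASHES
```

The key property is that matching is membership in a fixed set of reviewed content, not any analysis of what a new photo depicts.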