Comment by tjmc

4 years ago

Maybe I'm completely paranoid here, but given that actual sex offenders commonly seek out ways to be near children, what happens if one or more of them end up in Apple's image vetting team?

They'd be completely anonymous and fully covered, with an endless pipeline of naked kids images being delivered to them.

The idea that if you take a picture of your kid in the bath, it just happens to match a CSAM fingerprint and then gets silently transmitted to anonymous reviewers for "review" is terrifying.

This is a disgusting thought, but hear me out. Perhaps this might actually be a good job to give to a paedophile. Their classifications would probably have a lower false-positive rate than those of someone who is disgusted by the images, and it would all but eliminate any concern about an employee suffering psychological trauma.

  • > and it would all but eliminate any concern about an employee suffering psychological trauma.

    I doubt this. If they were all images that this person happened to be into, maybe... But even then, I think it would likely make their addiction to child porn worse, which is its own psychological problem that is probably worse for society than the trauma suffered by current employees. What happens when they leave that job and are used to seeing hundreds of cp images a day?

    Not to mention that some of the stuff the scanner would flag would probably be horrific and violent. Looking at that kind of thing all day would probably have similar psychiatric effects on pedophiles and non-pedophiles alike. In the worst case, it might cause some pedophiles to start to like the worse images out of boredom from seeing so much cp.

    Overall, I'd say this would just be a bad avenue to go down.

Your terrifying idea mischaracterizes the nature of false positives. Any photo in your library is as liable to be a false positive as any other; the perceptual hash is not looking for similar images by the metric you find similar (content). That's also why people have been able to turn arbitrary images into adversarial false positives.
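To make the "not similar by content" point concrete, here's a toy sketch of a perceptual hash. This is an "average hash", not Apple's NeuralHash (which uses a neural network embedding), but it illustrates the same idea: the hash captures a coarse pattern of bright and dark regions, so two images match when their gross shapes and shading line up, regardless of what they depict.

```python
# Toy "average hash": reduce an image to a bit per cell, 1 where the cell
# is brighter than the image's mean. Matching depends only on this coarse
# brightness layout, not on what the image semantically contains.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string."""
    flat = [v for row in pixels for v in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if v > mean else "0" for v in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length hash strings."""
    return sum(x != y for x, y in zip(a, b))

# Two tiny 4x4 "images" with different pixel values everywhere, but the
# same coarse layout (bright top half, dark bottom half):
img_a = [[200, 210, 190, 205],
         [180, 220, 200, 195],
         [ 30,  20,  40,  25],
         [ 10,  35,  15,  20]]
img_b = [[150, 160, 170, 155],
         [165, 175, 150, 160],
         [ 60,  50,  70,  55],
         [ 40,  65,  45,  50]]

print(average_hash(img_a))                              # 1111111100000000
print(hamming(average_hash(img_a), average_hash(img_b)))  # 0: identical hashes
```

No pixel is the same between the two images, yet their hashes collide, because only the gross arrangement of light and dark matters. That's also why adversarial collisions are possible: you can nudge an arbitrary image's coarse structure toward a target hash without making it look anything like the target to a human.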

  • So that picture of my driver's license I took for an ID check or that sensitive work document I scanned with my phone are just as likely to be sent? Great.

    • The image would need to be vaguely similar in terms of gross shapes and arrangement. It's exceedingly unlikely that any CSAM would ever be remotely similar to an ID card or a sheet of paper.

      If there are ever going to be any "natural" matches to any CSAM hashes, it's probably going to be a photograph of people who are coincidentally in a similar pose, at a nearly identical angle, with strikingly similar shading.


    • The chance that any pictures from your library are revealed at all is at most one in one trillion (mod you not storing CSAM or being attacked by someone trying to plant evidence on you). Contrast this to a server-side scanning system where every photo in your library is accessed, with unknown false-positive characteristics.
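The one-in-a-trillion figure comes from requiring a threshold of matches before anything is revealed. A minimal sketch of that reasoning, with made-up numbers (the per-image rate `p`, library size `n`, and threshold `t` below are illustrative assumptions, not Apple's actual parameters): if per-image false positives are independent, the account-level rate is a binomial tail probability, which shrinks dramatically as the threshold rises.

```python
from math import comb

def account_flag_prob(p, n, t):
    """P(X >= t) for X ~ Binomial(n, p): the chance that at least t of n
    independent per-image false positives occur in one photo library.
    Summed term by term from k = t to avoid floating-point cancellation."""
    q = 1.0 - p
    term = comb(n, t) * p**t * q**(n - t)  # P(X = t)
    total = 0.0
    for k in range(t, n + 1):
        total += term
        term *= (n - k) * p / ((k + 1) * q)  # ratio P(X=k+1)/P(X=k)
        if term < total * 1e-17:             # remaining terms are negligible
            break
    return total

p = 1e-4    # hypothetical per-image false-positive rate
n = 10_000  # photos in the library

print(account_flag_prob(p, n, 1))   # at least one match: roughly 0.63
print(account_flag_prob(p, n, 30))  # at least 30 matches: ~1e-33
```

Even with a per-image false-positive rate this high, requiring ~30 independent matches before human review pushes the account-level probability to an astronomically small number; a single-match server-side scanner has no such cushion.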