Comment by gkoberger

4 years ago

I don't know... I have a really hard time getting too upset about this. I'm a big proponent of privacy and have always been a Snowden supporter. And while "protecting children" is a trope in politics, I think everyone with an iPhone knows they're giving up some privacy to own one. It's constantly tracking their location and sending other data to Apple.

This isn't a government agency. Apple has been incredibly thoughtful about privacy in the past, and I feel like they've earned the benefit of the doubt here.

I hope I'm not wrong, but I don't see how this is insane. They're just making sure the files you upload to them aren't illegal.

Maybe I'm completely paranoid here, but given that actual sex offenders commonly seek out ways to be near children, what happens if one or more of them end up in Apple's image vetting team?

They'd be completely anonymous and fully covered, with an endless pipeline of naked kids images being delivered to them.

The idea that if you take a picture of your kid in the bath, it just happens to match a CSAM fingerprint and then gets silently transmitted to anonymous reviewers for "review" is terrifying.

  • This is a disgusting thought, but hear me out. Perhaps this might actually be a good job to give to a paedophile. Their classifications would probably have a lower false-positive rate than those of someone who is disgusted by the images, and it would all but eliminate any concern about an employee suffering psychological trauma.

    • > and it would all but eliminate any concern about an employee suffering psychological trauma.

      I doubt this. If they were all images that this person happened to be into, maybe... But even then, I think it would likely make their addiction to child porn worse, which is its own psychological problem, and probably worse for society than the trauma suffered by current employees. What happens when they leave that job and are used to seeing hundreds of cp images a day?

      Not to mention that some of the stuff the scanner would be looking at would probably be horrific and violent. Looking at that kind of thing all day would probably have similar psychiatric effects on pedophiles and non-pedophiles alike. In the worst case, it might cause some pedophiles to start to like the worse images out of boredom from seeing so much cp.

      Overall, I'd say this would just be a bad avenue to go down.

      3 replies →

  • Your terrifying idea mischaracterizes the nature of false positives. Any photo in your library is as liable to be a false positive as any other; the perceptual hash is not looking for similar images by the metric of what you find similar (content). That’s also the underlying idea behind why people have been able to turn arbitrary images into adversarial false positives.

    • So that picture of my driver's license I took for an ID check or that sensitive work document I scanned with my phone are just as likely to be sent? Great.

      10 replies →
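To make the point above concrete, here is a minimal, hypothetical sketch of perceptual hashing (a toy average hash, not Apple's actual NeuralHash): a "match" is just a bit-distance comparison between hashes, and nothing in the comparison inspects what the images depict. All names and thresholds here are illustrative assumptions.

```python
# Toy average hash: one bit per pixel, set when the pixel is brighter
# than the image's mean brightness. Illustrative only; real perceptual
# hashes (and Apple's NeuralHash) are far more sophisticated.

def average_hash(pixels):
    """Hash a flat list of grayscale pixel values into an int of bits."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Two images "match" when their hash distance is under some threshold.
# The comparison is purely bitwise; semantic content never enters into it.
img_a = [10, 200, 30, 220, 15, 210, 25, 205, 12]   # original
img_b = [12, 198, 28, 225, 14, 205, 27, 200, 10]   # near-duplicate
img_c = [200, 10, 220, 30, 210, 15, 205, 25, 240]  # inverted

print(hamming_distance(average_hash(img_a), average_hash(img_b)))  # small
print(hamming_distance(average_hash(img_a), average_hash(img_c)))  # large
```

This is why a false positive is a hash collision, not a "this looks like that" judgment: any image whose bits happen to land near a fingerprint can trip the match, regardless of its content.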

"just making sure the files you upload to them aren't illegal"

the problem lies in this sentence. 1) this happens on device, before the photos are uploaded, which is a monumental shift for a company that claims to be pro-privacy 2) they're now saying they're willing to surveil photos for governments; the reason is sorta irrelevant. they're opening Pandora's box - are they going to start scanning files on behalf of the RIAA or other copyright holders now?

I'm curious, could you unpack what this means for me? "big proponent of privacy and have always been a Snowden supporter"

Given that, my assumption is this would click for you, but as you said it doesn't. What does being a big proponent mean to you? How do you support Snowden? What's important to you about privacy? Curious to hear your logic, I bet there are tons of people who have the same concerns (or lack of).

"just"

And who gets to decide what is illegal?

  • Laws?

    • Which laws?

      I'm uncomfortable enough with US laws or Australian laws deciding what images are "illegal" to have on my phone.

      But China also have laws. And Iraq. And Saudi Arabia.

      And Afghanistan had one set of laws last month, but new different laws since then.

      Multinational corps may like to think they've lifted themselves above petty regional political and legal pressure from elected governments and regional law enforcement. But Apple is 100% capable of being destroyed by either China or the USA. Anyone who thinks otherwise is fooling themselves. If you shut down Apple's manufacturing in China, or its head office and server farms in the US, they'd be finished.

      2 replies →

    • Yes, and that's the problem. If you think that the laws protect you, an average iPhone user, you are sadly mistaken. Laws are dictated by the highest bidder, and by those in power. If they want to go after gays, or Jews, or Blacks, or Muslims, or immigrants, or political activists, or whoever, do you think the laws are going to protect or remain neutral to those groups?

Thank you for expressing this opinion. I know it’s not a popular one; but I’m 100% with you.

Time has proven otherwise. All censorship systems start with protecting the kids, and grow to eventually encompass all undesirable content…

  • How have Google’s, Facebook’s, or Microsoft’s CSAM scanning grown?

    • Facebook filters and censors content on a massive scale. They even block private messages with certain keywords or domains

    • To be fair, Google and Facebook both do plenty of content scanning, but you’re right that it likely doesn’t use the CSAM pipelines. Apple’s pipeline is probably even harder to repurpose since it’s designed specifically for this use case.

    • Unknown re private scanning; however, Australia and the UK have rigorous internet censorship regimes that block copyright infringement and gambling sites using exactly the same censorship infrastructure that was built to block CSAM in the early 00s.

      1 reply →