Comment by robbiet480

4 years ago

This has worrying privacy implications. I hope Apple makes a public announcement about this, but I wouldn't be surprised if they don't. I'd also expect the EFF to get on this shortly.

What are the implications?

  • To quote another tweet from Matthew Green, the author of the Twitter thread (https://twitter.com/matthew_d_green/status/14231103447303495...):

    > Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content.

    > That’s the message they’re sending to governments, competing services, China, you.

    • Is it? That's just something the tweets have read into it.

      The message could equally well be 'We won't become an easy political target by ignoring a problem most people care about, like child porn, but we are going to build a point solution to that problem so the public doesn't force us to bow to government surveillance requests.'

      It’s easy to nod along with an anti-Apple slogan, but we need to consider what would happen if they didn’t do this.

      4 replies →

  • False positives, poisoned hash sets, engineered collisions, and so on. And what happens when you come up positive - does the local sheriff just get a warrant and SWAT you at that point? Is the detection of a hash prosecutable? Is it enough to get your teeth kicked in, or to get you informally labeled a pedo by your local police? On the flip side, since it's running on the client, could actual pedophiles use it to mutate their images until they evade the hashing algorithm?

    • False positives are clearly astronomically unlikely. Not a real issue.

      Engineered collisions seem unlikely too, though not impossible. Unless there is a straight-up cryptographic defect in the hash algorithm, it's hard to see how engineered collisions could be made to happen at any scale. (A sketch of the kind of perceptual hashing involved follows this sub-thread.)

      11 replies →
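
    For context on the evasion and collision questions above, here is a minimal sketch of a perceptual "difference hash" (dHash), one common near-duplicate matching technique. Apple has not published how its hash actually works, so this illustrates the category being discussed, not Apple's design:

        from PIL import Image

        def dhash(image: Image.Image, hash_size: int = 8) -> int:
            """Hash an image by comparing adjacent pixel brightness.

            Detail is discarded on purpose so that re-encoding, resizing,
            or light edits leave the hash unchanged - which is the whole
            point, and also why evasion-by-mutation and engineered
            collisions are conceivable attacks on this class of hash.
            """
            # Shrink to (hash_size+1) x hash_size grayscale pixels.
            small = image.convert("L").resize((hash_size + 1, hash_size))
            pixels = list(small.getdata())

            # One bit per adjacent-pixel comparison, 64 bits by default.
            bits = 0
            for row in range(hash_size):
                for col in range(hash_size):
                    left = pixels[row * (hash_size + 1) + col]
                    right = pixels[row * (hash_size + 1) + col + 1]
                    bits = (bits << 1) | (left < right)
            return bits

        def hamming_distance(a: int, b: int) -> int:
            """Bits that differ; a small distance counts as a 'match'."""
            return bin(a ^ b).count("1")

    Because matching is fuzzy by design, an attacker could in principle nudge pixels until an image crosses the match threshold in either direction: mutating a real image until it no longer matches, or crafting an innocuous image that does.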

  • A country can collect a list of people sharing any content they put on a hash list.

    Like gay porn, a 'save Khashoggi' meme, or a photo from a documentary about missing Uighurs.

    It's hard to imagine how this could be misused, right?

    • That seems like a real problem, and of course it could be misused; however, nothing revealed so far actually tells us whether it's possible.

      E.g. how the hashes are computed, where they come from, and what happens when a positive match is detected.

      Until we have a clear understanding of these things, the rest is just speculation. (A hypothetical sketch of what such matching could look like follows this thread.)

      2 replies →
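
    For concreteness, a hypothetical sketch of what client-side hash-list matching could look like, reusing dhash() and hamming_distance() from the sketch earlier in the thread. The prohibited-hash set, the match threshold, and the reporting step are all assumptions for illustration; none of this is Apple's published design:

        from pathlib import Path
        from PIL import Image

        # Opaque list supplied by some authority. The client cannot tell
        # whether an entry denotes CSAM, a banned meme, or anything else -
        # whoever controls this list controls what gets flagged, which is
        # exactly the misuse concern raised above.
        PROHIBITED_HASHES: set[int] = set()  # hypothetical; would ship with the OS

        MATCH_THRESHOLD = 4  # max Hamming distance treated as a match (assumed)

        def scan_library(photo_dir: Path) -> list[Path]:
            """Return local photos whose perceptual hash matches the list."""
            matches = []
            for path in sorted(photo_dir.glob("*.jpg")):
                h = dhash(Image.open(path))
                if any(hamming_distance(h, bad) <= MATCH_THRESHOLD
                       for bad in PROHIBITED_HASHES):
                    # A real system would report the match somewhere;
                    # here we only collect the paths.
                    matches.append(path)
            return matches

    The open questions above - how the hashes are computed, who supplies the list, and what happens on a positive match - correspond to dhash(), PROHIBITED_HASHES, and the body of the if block here, and the answer to each one changes the privacy picture.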