
Comment by 3pt14159

4 years ago

I'm really conflicted about this.

For context, I deeply hate the abuse of children, and I've worked on a contract before that landed 12 human traffickers in custody who were smuggling sex slaves across borders. I didn't need to know details about the victims in question, but it's understood that they're often teenagers or children.

So my initial reaction when reading this Twitter thread was "let's get these bastards" but on serious reflection I think that impulse is wrong. Unshared data shouldn't be subject to search. Once it's shared, I can make several cases for an automated scan, but a cloud backup of personal media should be kept private. Our control of our own privacy matters. Not for the slippery slope argument or for the false positive argument, but for its own sake. We shouldn't be assuming the worst of people without cause or warrant.

That said, even though I feel this way, a not-small-enough part of me will be pleased if it is deployed, because I want these people arrested. It's the same way I feel when terrorists get captured, even if intelligence services bent or broke the rules. I can be happy at the outcome without being happy at the methods, and I can feel queasy about my own internal, conflicted feelings throughout it all.

Having known many victims of sexual violence and trafficking, I feel for the folks that honestly want that particular kind of crime to stop. Humans can be complete scum. Most folks in this community may think they know how low we can go, but you are likely being optimistic.

That said, law enforcement has a nasty habit of having a rather "binary" worldview. People are either cops or uncaught criminals... and they wonder why they have so much trouble making non-cop friends (DISCLAIMER: I know a number of cops).

With that worldview, it can be quite easy to "blur the line" between child sex traffickers and parking ticket violators. I remember reading an article in The Register about how anti-terrorism statutes were being abused by local town councils to do things like find zoning violations (for example, pools with no CO).

Misapplied laws can be much worse than letting some criminals go. This could easily become a nightmare if we cede too much to AI.

And that isn't even talking about totalitarian regimes, run by people of the same ilk as child sex traffickers (only wearing Gucci, and living in palaces).

"Any proposal must be viewed as follows. Do not pay overly much attention to the benefits that might be delivered were the law in question to be properly enforced, rather one needs to consider the harm done by the improper enforcement of this particular piece of legislation, whatever it might be."

- Lyndon B. Johnson

  • > People are either cops or uncaught criminals... and they wonder why they have so much trouble making non-cop friends (DISCLAIMER: I know a number of cops).

    Ehh let's not make a habit of asserting anecdote as fact, please. Saying you know cops is like saying you know black people and that somehow it affords you some privilege others do not possess.

    This is a weak and ad-hom argument.

    • > is like saying you know black people and that somehow it affords you some privilege others do not possess.

      Of course it does. Interacting with black people (or any race) affords you insight into their life experiences, struggles, worldview etc...

      Of course sociological discourse is highly subjective, but this attitude on HN that anecdotal data has no value whatsoever is silly. Do you seriously expect every fact about every group of people to be published in some infallible academic journal?


    • I really appreciate having my words taken out of context, and wrapped in insults.

      I probably could have done without the second sentence, but the first stands.

      Have a nice day.


> I'm really conflicted about this.

I'm not. I am very unambiguously against this and I think if word gets out Apple could have a real problem.

I would like to think I am as against child porn as any well-adjusted adult. That does not mean I wish for all my files to be scanned without my consent or even knowledge, submitted who knows where, matched against who knows what, and reported to, well, who knows.

That's crossing a line. You are now reading my private files, interpreting them, and doing something based on that interpretation. That is surveillance.

I am very not OK with this.

If you truly want to "protect the children", you should have no issue with the police visiting and inspecting your house, and all of your neighbors' houses. Every few days. Unannounced, of course. And if you were to resist, you MUST be a pedophile who is actively abusing children in your basement.

  • I'm actually more OK with unannounced inspections of my basement (within reason) than with some government agents reading through my files all the time.

    • Why just your basement? While they're there they might kindly ask you to unlock that computer or hand over that phone. Just to make sure, of course.

- Don't you want to get the terrorists?

- Yeah, yeah

- Great. Give me access to every part of your life so I know you're not a terrorist.

  • “If you want a vision of the future, imagine a boot stamping on a human face - forever.” I always think about this Orwell quote and think it’s up to us to try to fight for what is good, but we were too busy doom-scrolling on Twitter to do anything about it.

The NCMEC database that Apple is likely using to match hashes contains countless non-CSAM pictures that are entirely legal, not only in the U.S. but globally.

This should be reason enough for you to not support the idea.

From day 1, it's matching legal images and phoning home about them. Increasing the scope of scanning is barely a slippery slope; they're already beyond the stated scope of the database.
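
For anyone who wants the mechanics rather than the policy: below is a rough sketch of perceptual-hash matching using a toy difference hash (dHash). To be clear, this is not Apple's NeuralHash or Microsoft's PhotoDNA (neither algorithm is public), and the blocklist value is invented; the point is only that the scanner compares fingerprints against whatever list it is handed, with no notion of whether the underlying image is actually illegal.

    # Toy perceptual-hash matcher (dHash) -- illustrative only; Apple's NeuralHash
    # and Microsoft's PhotoDNA are proprietary and differ in detail.
    from PIL import Image

    def dhash(path, hash_size=8):
        # Difference hash: downscale to grayscale, then record whether each
        # pixel is brighter than its right-hand neighbour.
        img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
        pixels = list(img.getdata())
        bits = 0
        for row in range(hash_size):
            for col in range(hash_size):
                left = pixels[row * (hash_size + 1) + col]
                right = pixels[row * (hash_size + 1) + col + 1]
                bits = (bits << 1) | (1 if left > right else 0)
        return bits

    def hamming(a, b):
        return bin(a ^ b).count("1")

    # Hypothetical blocklist entry -- in the real system this would be a large
    # set of fingerprints supplied by NCMEC; the scanner never sees the source
    # images, only whether a local photo's fingerprint lands close to one.
    BLOCKLIST = {0x3F4A1B2C5D6E7F80}

    def is_flagged(path, max_distance=4):
        h = dhash(path)
        return any(hamming(h, bad) <= max_distance for bad in BLOCKLIST)

The Hamming-distance tolerance is what makes matching robust to resizing and re-compression. Note that the scanner never evaluates legality; it only measures distance to the supplied fingerprints, which is why the contents of that list matter so much.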

  • The database seems legally murky. First of all, who would want to manually verify that there aren't any images in it that shouldn't be? Even if the public could request to see it, which I doubt, would doing so put you on a watch list of potentially dangerous people or destroy your reputation? Who adds images to it, and where do they get those images from?

    My point is that we have no way to verify that the database isn't abused or mistaken, and a lot of that rests on the fact that CSAM is not something people want to have to encounter, ever.

    • It’s a database of hashes, not images, though, right? I would argue the hashes absolutely should be public, just as any law should be public (and yes, I am aware of some outrageously brazen exceptions to even that).

      Anyone should be able to scan their own library against the database for false positives. “But predators could do this too and then delete anything that matches!” some might say, but in a society founded on the presumption of innocence, that risk is a conscious trade-off we make.


  • Could you say more about these legal photos? That's a pretty big difference from what I thought was contained in the DB.

    • If pictures are recovered alongside CSAM but are not CSAM themselves, they can be included in the database.

      The thing I can publicly say is that the database is not strictly for illegal or even borderline imagery.

      NCMEC try to keep the contents, processes and access to the database under wraps for obvious reasons.


    • I would imagine it would include things like Nirvana's Nevermind album cover or, as a better example, the Scorpions' cover for Virgin Killer.


  • NCMEC is a private organization created by the U.S. Government, funded by the U.S. Government, operating with no constitutional scrutiny and no oversight or accountability, open to prodding by the U.S. Government, and they tell you to "trust them".

  • To be fair the Twitter thread says (emphasis mine) "These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear."

    I don't know what the cutoff is, but it doesn't sound like they believe that possession of a single photo in the database is inherently illegal. That doesn't mean this is overall a good idea. It simply weakens your specific argument about occasional false positives.
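
    As a bare-bones illustration of that cutoff logic (the actual threshold is unpublished, and Apple reportedly wraps the count in cryptographic threshold secret sharing rather than a plain counter, so treat this as a sketch of the argument, not of the implementation):

        # Toy model of "report only if too many photos match the blocklist".
        # REPORT_THRESHOLD is invented; the real cutoff has not been published.
        REPORT_THRESHOLD = 10

        def should_report(photo_paths, matches_blocklist):
            # matches_blocklist(path) -> bool stands in for the perceptual matcher
            count = sum(1 for path in photo_paths if matches_blocklist(path))
            return count >= REPORT_THRESHOLD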

Since you worked on an actual contract catching these sorts of people, you are perhaps in a unique position to answer the question: will this sort of blanket surveillance technique, in general but also in iOS specifically, actually work to help catch them?

  • I have direct knowledge of cases where individuals were arrested and convicted of sharing CP online, and they were identified because a previous employer of mine ran PhotoDNA analysis on all user-uploaded images. So yeah, this type of thing can catch bad people. I'm still not convinced Apple doing this is a good thing, especially on private media content without a warrant, even though the technology can help catch criminals.
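
    For what it's worth, the scan-at-upload model described above is conceptually simple; a minimal sketch (with hypothetical function names standing in for a PhotoDNA-style matcher and the provider's reporting hook) looks something like this:

        # Scan-on-upload: the service checks content at the moment it is shared,
        # rather than scanning files that never leave the user's device.
        # 'matcher' and 'report_to_ncmec' are hypothetical stand-ins.
        def handle_upload(user_id, image_bytes, matcher, report_to_ncmec):
            if matcher(image_bytes):       # PhotoDNA-style perceptual match
                report_to_ncmec(user_id)   # e.g. a CyberTipline report
                return "rejected"
            return "stored"

    That difference -- scanning at the sharing boundary versus scanning the device itself -- is exactly the line several commenters here are drawing.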

    • Now I'm afraid. I have two young children under 5 years old. I have occasionally taken pictures of them naked, showing some bumps on the skin or a mosquito bite, and sent them to my wife over WhatsApp so we could decide whether we needed to see a doctor. Do I have to fear now that I will be marked as distributing CP?


    • Look at all the recent findings that have come to light regarding ShotSpotter law enforcement abuse [1]. These systems, along with other image and object recognition projects, are rife with false positives, bias, and garbage-in, garbage-out. They should in no way be considered trustworthy for criminal accusations, let alone arrests.

      As mentioned in the Twitter thread, how do image hashing and recognition tools such as PhotoDNA handle adversarial attacks? [2][3] A toy sketch of the general idea is below.

      [1] https://towardsdatascience.com/black-box-attacks-on-perceptu...
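
      To make the adversarial question concrete, here is a toy evasion example against a simple average hash. It is emphatically not an attack on PhotoDNA or NeuralHash (whose internals are not public, and against which real attacks are black-box or gradient-based, as in the linked write-up); it only shows the underlying idea that changes too small to notice can flip fingerprint bits until an image stops matching.

          # Toy evasion against an average hash, for illustration only. A real
          # attack would perturb the full-resolution image, not the thumbnail.
          from PIL import Image

          def ahash(img, size=8):
              # 1 bit per pixel: is this pixel brighter than the mean brightness?
              small = img.convert("L").resize((size, size))
              px = list(small.getdata())
              mean = sum(px) / len(px)
              return [1 if p > mean else 0 for p in px]

          def evade(img, size=8, nudge=6):
              # Nudge pixels sitting just below the mean up across it; visually
              # negligible, but pixels pushed over the mean tend to flip hash bits.
              small = img.convert("L").resize((size, size))
              px = list(small.getdata())
              mean = sum(px) / len(px)
              tweaked = [min(255, p + nudge) if mean - nudge < p <= mean else p
                         for p in px]
              out = Image.new("L", (size, size))
              out.putdata(tweaked)
              return out

          # Usage with a hypothetical file:
          #   original = Image.open("holiday.jpg")
          #   flipped = sum(a != b for a, b in
          #                 zip(ahash(original), ahash(evade(original))))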

  • Just as being banned from one social media platform for bad behavior pushes people to a different social media platform, this might very well push exactly the wrong sort of people from iOS to Android.

    If Android then implements something similar, they at least have the option to simply run different software, since Android lets you run whatever you want so long as you sign the waiver.

    "You're using Android?! What do you have to hide?" -- Apple ad in 2030, possibly

  • I'm the person you're responding to, and I think so? My contract was on data that wasn't surveilled; it was willingly supplied in bad faith. Fake names, etc. And there was cause / outside evidence to look into it. I can't really go into more details than that, but it wasn't for an intelligence agency. It was for another party that wanted to hand something over to the police after they found out what was happening.

    • I see. I was responding to you, yes. And in this case I was more curious about your opinion - based on your previous knowledge - on the viability of Apple’s technology here, rather than the specific details of your work.

      In my (uninformed) opinion, this looks more like a bad-faith move on Apple's part that will maybe catch some bad actors but will be a net harmful result for Apple's users and society, as expressed in the Twitter thread.

      Others who responded here though also seem to think it’ll be a viable technique.

This scanning doesn't prevent the actual abuse, and all this surveillance doesn't get to the root of the problem, but it can be misused by authoritarian governments.

It's a Pandora's box. You wouldn't allow regular searches of your home in real life.

Let me help you, then, because you shouldn't be conflicted at all.

"Think of the children" is how they force changes that would otherwise be unconscionable.

They've done it with encryption and anonymity for years. Now they're doing it with the hardware in your pocket.

I'm not at all conflicted about this. Obviously CSAM is bad and should be stopped, but it is inevitable this new feature will become a means for governments to attack material of far more debatable 'harm'.

Is this confirmed yet, or just something someone believes will happen? How credible are the sources of this Twitter account?

Well, it's a lot like everything else. No one wants abusers, murderers, and others like them out and about. But then, we can't search everyone's home all of the time for dead bodies or evidence of other crimes.

We would all be better off without these things happening, and anyone would want less of it to happen.

> Unshared data shouldn't be subject to search

Since they are only searching for _known_ abusive content, by definition they can only detect data that has been shared, which I think is the important point here.

There have been child abuse victims who have openly condemned this sort of intrusion on privacy, although they obviously don't speak for them all.

> Unshared data shouldn't be subject to search. Once it's shared, I can make several cases for an automated scan, but a cloud backup of personal media should be kept private. Our control of our own privacy matters. Not for the slippery slope argument or for the false positive argument, but for its own sake. We shouldn't be assuming the worst of people without cause or warrant.

I have a much simpler rule: Your device should never willingly* betray you.

*With a warrant, police can attempt to plant a bug, but your device should not help them do so.

  • I don't think this rule makes any sense, because it just abstracts all the argument into the word "betray".

    The vast majority of iPhone users won't consider it a betrayal that they can't send images of child abuse, any more than they consider it a betrayal that it doesn't come jailbroken.

    The victims of child abuse depicted in these images may well have considered it a betrayal by Apple that they allowed their privacy to be so flagrantly violated on their devices up until now.

    • I don't think you read your ancestor post carefully enough. I at least don't see any room for ambiguity.

      The rule is that your (note the emphasis) device won't ever willingly betray you. There's nothing here that implicates the majority in any way. Simply, your own device should never work against you.

      This actually sounds like a great rule to prevent this kind of authoritarian scope creep.

    • > The vast majority of iPhone users won't consider it a betrayal that they can't send images of child abuse

      Probably neither would child abusers, since as soon as they send an image of child abuse, they're much more likely to be caught than if it had stayed on their phone.

      > a betrayal by Apple that they allowed their privacy to be so flagrantly violated on their devices up until now.

      "Their" devices? Once Apple sells an iPhone, it no longer belongs to Apple. Taking "betrayal" to mean "didn't plant backdoors on other people's computers to catch your abusers" is stretching that word far beyond reason.