
Comment by ChrisKnott

4 years ago

> "What is the utopian perspective of this which counterbalances the risks for this to be a path worth taking?"

Basically victims of rape don't want imagery of their rape freely distributed as pornography. They consider that a violation of their rights.

It's interesting how many users in this thread are instinctively siding with the offenders in this, and not the victims. Presumably because they made it through their own childhoods without having imagery of their own abuse shared online.

You are creating a false dichotomy here. There are more than two sides to this, and the framing paints a black-and-white picture that doesn't exist.

I strongly believe that nobody wants to further victimize people by publicly showing images of their abuse.

And I believe very strongly that putting hundreds of millions of people under blanket general suspicion is a dangerous first step.

Imagine if every bank had to search every document in its safe deposit boxes to see whether people had committed tax evasion (or stored other illegal things, like blood diamonds obtained with child labor). That would be the physical-world equivalent.

Now add to this, as discussed elsewhere here, that the database in question contains not only images of victims but also perfectly legal images. This can lead to people being subjected to a house search because they have perfectly legal data stored in their cloud.

Furthermore, this means that a single country's understanding of the law is applied to a global user community. From a purely legal point of view, this is an interesting problem.

And yes: I would like to see effective measures to make the dissemination of such material more difficult. At the same time, however, I find it hard to accept a tool for this purpose that is subject to no rule-of-law oversight and cannot be independently audited if the worst comes to the worst.

  • Using your bank analogy for a second: banks already report activity to authorities, who can then identify people to investigate based on patterns. I've heard that large transactions (over $10k), or amounts structured to stay just under that threshold, are flagged.

    A great deal of skepticism is being directed at the NCMEC database in these comments, which surprises me; from the information I have, I think it is being exaggerated. At the same time, we have no idea whether Apple would even use that database or another one they may have created themselves.

    • > I've heard that large transactions (over $10k), or amounts structured to stay just under that threshold, are flagged.

      This is transmission of funds, and there are laws regulating how those transfers are monitored.

      I used bank vaults as the analogy because you put things into a vault without the bank necessarily knowing what is inside. If they knew, they would need to report it to the authorities.

      So Apple doing this scan would be like the bank opening every vault, scanning the contents, and reporting findings to the IRS (the US tax authority, if I am not mistaken; in Germany it would be the Finanzamt).

> It's interesting how many users in this thread are instinctively siding with the offenders in this, and not the victims.

That is infantile. Painting people advocating privacy as siding with offenders is highly insulting.

  • This is a situation where different people's privacy is in conflict. What's infantile is claiming sole ownership of privacy advocacy while so-whating the worst privacy violation imaginable, from the victims' perspective.

    • That's an interesting point. However, I'm not sure victim privacy is the reason for CSAM regulations. Rather, it's reducing the creation of CSAM by discouraging its exchange. For example, suppose that instead of deleting/reporting the images, Apple detected them and altered them with deepfake techniques so the victim was no longer identifiable. That would protect the victim's privacy but wouldn't reduce the creation or exchange. The fact that such a proposal is ridiculous suggests that privacy isn't the reason for regulation; reducing creation and exchange is.


    • The two sides proposed by your argument are only logically valid opposites if you can logically/mathematically guarantee that this technology will only ever be used for detecting photos depicting blatant and obvious sex abuse. Since you cannot, the entire argument is void. I'm not siding with abusers, I simply want arbitrary spies staying the hell away from my computers.

I feel it's a little disingenuous to describe millions of innocent people being surveilled as "the offenders" because there are a handful of actual offenders among them.

  • I didn't do that...?

    There's a small number of victims, a small number of offenders (though far more than "a handful"), and hundreds of millions of other users. This change is in the direct interest of the victims and in direct opposition to the offenders.

    Most normal people probably support the measures in solidarity with the first group, the victims; HN generally doesn't.

    • ...And in direct opposition to those hundreds of millions of other users. Trying to fit this to a victims-vs.-offenders model is a deliberate attempt to turn those hundreds of millions of other users into uninvolved bystanders. They have been pushed out by the model's lack of space for them and for their right not to have their door kicked down based on the results of an algorithm and a database they cannot audit, which are susceptible to targeted adversarial attacks and authoritarian interference respectively.


    • Having private devices randomly snooped for forbidden materials is fine, okay. So why limit this to phones?

      There are kidnapped children being locked inside homes. If you don't open your doors and accept weekly full home inspections, I think it's safe to say you support offenders and hate victims. I mean, we're all against people kidnapping and abusing children.

      There's a small number of victims, a small number of offenders (though far more than "a handful"), and hundreds of millions of other homeowners. This change is in the direct interest of the victims and in direct opposition to the offenders.

What about the victims of the Apple employees and government officials who exploit this?

It is a violation of their rights. But we already have a justice system that makes knowingly distributing such images (and committing such acts with the intent to distribute them) a crime.

It's also entirely plausible this could be used to send people you don't like to prison: send their phone images that look innocuous to a human but that the matching algorithm classifies as known abuse material.
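
To make that risk concrete: systems like this generally compare a lossy "perceptual hash" of each photo against a database of flagged hashes, with some fuzzy-match tolerance. Below is a minimal sketch of the general technique using a simple average hash; this is not Apple's NeuralHash, and the `BLOCKLIST` contents and `THRESHOLD` are made-up illustrative values. Because the fingerprint is tiny and deliberately robust to small pixel changes, an attacker can perturb an innocuous-looking image until its hash lands within the threshold of a flagged one.

```python
# Minimal sketch of generic perceptual-hash matching (an "average hash").
# NOT Apple's NeuralHash; BLOCKLIST and THRESHOLD are hypothetical values.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size grayscale, then set one bit per pixel
    depending on whether it is brighter than the image's mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    # Number of differing bits between two 64-bit hashes.
    return bin(a ^ b).count("1")

BLOCKLIST = {0x9F3A61C42B7D08E5}  # hypothetical flagged hash
THRESHOLD = 5                     # fuzzy-match tolerance, in bits

def is_flagged(path: str) -> bool:
    h = average_hash(path)
    return any(hamming(h, bad) <= THRESHOLD for bad in BLOCKLIST)
```

The fuzziness that lets the hash survive resizing and recompression is exactly what an adversarial attack exploits; collisions of this kind were in fact demonstrated against an extracted copy of NeuralHash shortly after it was announced.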

It's also quite likely this will evolve into scanning for more than just abuse imagery, especially in the hands of governments around the world.