Comment by secondo
4 years ago
It's quite easy to extrapolate this and in a few steps end up in a boring dystopia.
First it's iPhone photos, then it's all iCloud files, then it spills into Macs using iCloud, then it's client-side reporting of local Mac files, and somewhere along the way all the other Apple hardware I've filled my home with has received equivalent updates and is phoning home to verify that I don't have files, or whatever other data it can see or hear, that some unknown authority has decided should be reported.
What is the utopian perspective of this which counterbalances the risks for this to be a path worth taking?
> What is the utopian perspective of this which counterbalances the risks for this to be a path worth taking?
Apple takes care of everything for you, and they have your best interests at heart. You will be safe, secure, private and seamlessly integrated with your beautiful devices, so you can more efficiently consume.
What's not to like about a world where child crime, terrorism, abuse, radical/harmful content and misinformation can be spotted at inception and at the source and effectively quarantined?
No one here has a problem with the worst criminals being taken out. The problem is the scope creep that always comes after.
In 2020 and 2021 we saw people being arrested for planning/promoting anti-lockdown protests: not for actually participating, but simply for posting about it. The scope of what counts as "harmful content" is infinite. You might agree that police do need to take action against these people, but surely you can see how the scope crept from literal terrorists and pedophiles to edgy Facebook mums, and how it could move even further to simple criticism of the government or religion.
It's difficult to say where we draw the line so that horrible crimes are punished while reasonable privacy and freedom are still protected. I'm guessing Apple's justification here is that they are not sending your photos to the police but simply checking them against known bad hashes, and if you are not a pedophile, there will be no matches and none of your data will have been exposed.
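To make the "checking against known bad hashes" part concrete, here is a minimal sketch of local hash-set matching (Python, with a made-up hash list). This is only an illustration of the basic idea, not Apple's actual pipeline: their system reportedly uses a perceptual hash (NeuralHash) plus cryptographic thresholding rather than an exact file digest.

    # Minimal sketch of hash-set matching, NOT Apple's actual system.
    # A real perceptual hash would tolerate resizing/re-encoding;
    # SHA-256 here only matches byte-identical files.
    import hashlib
    from pathlib import Path

    # Hypothetical database of known-bad hashes (hex digests).
    KNOWN_BAD_HASHES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def file_digest(path: Path) -> str:
        """Exact SHA-256 digest of the file bytes."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def scan(photo_dir: Path) -> list[Path]:
        """Return photos whose digest appears in the known-bad set."""
        return [p for p in photo_dir.glob("*.jpg")
                if file_digest(p) in KNOWN_BAD_HASHES]

    if __name__ == "__main__":
        for match in scan(Path("~/Pictures").expanduser()):
            print("match:", match)

The privacy argument hinges on that last step: nothing leaves the device unless a digest lands in the known-bad set, which is exactly why the contents of that set matter so much.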
We also saw the police query "check-in" databases which were pitched to the public as "for contact-tracing purposes only". Scope creep is inevitable.
9 replies →
We also saw HN shadow banning entire IP CIDR blocks because they didn't like the arguments being put forth against fleeting CDC guidance, or the Chinese lab origin theory, in 2020. You can't register from these CIDR blocks, and if you already had an account, your comments would just end up in a black hole. Dang can explain.
4 replies →
Who will be accountable for the creeps at Apple? Or their overlords in the government?
I mean, Apple isn't too far from the Mac thing you mention. Since Catalina, running an executable on macOS phones home and checks for a valid signature against their servers.
No, this is entirely different.
> What is the utopian perspective of this
You will make Apple tons of money.
> It's quite easy to extrapolate this and in a few steps end up in a boring dystopia.
It's only boring until we get another Hitler or equivalent.
> "What is the utopian perspective of this which counterbalances the risks for this to be a path worth taking?"
Basically victims of rape don't want imagery of their rape freely distributed as pornography. They consider that a violation of their rights.
It's interesting how many users in this thread are instinctively siding with the offenders in this, and not the victims. Presumably because they made it through their own childhoods without having imagery of their own abuse shared online.
You are actually creating a false dichotomy here. There are more sides to this than two; you are painting a black-and-white picture that, as said, is false.
I strongly believe that nobody wants to further victimize people by publicly showing images of their abuse.
And I believe very strongly that putting hundreds of millions of people under blanket general suspicion is a dangerous first step.
Imagine if every bank had to search all documents in safe deposit boxes to see if people had committed tax evasion (or stored other illegal things like blood diamonds obtained with child labor). That would be an equivalent in the physical world.
Now add to this, as discussed elsewhere here, that the database in question contains not only images of victims, but also perfectly legal images. This can lead to people "winning" a house search because they have perfectly legal data stored in their cloud.
Furthermore, this means that a single country's understanding of the law is applied to a global user community. From a purely legal point of view, this is an interesting problem.
And yes: I would like to see effective measures to make the dissemination of such material more difficult. At the same time, however, I find it problematic to use for this purpose a tool that is not subject to any rule-of-law oversight and cannot be audited if worst comes to worst.
Using your bank analogy for a second: banks already report activity to authorities, who can then identify people to investigate based on patterns. I've heard that large transactions (over $10k), or ones just under that threshold, are flagged.
A great deal of skepticism is being directed at the NCMEC database in these comments, which surprises me; from the information I have, I think the concerns are exaggerated. At the same time, we have no idea whether Apple would even be using that database or another one they may have created themselves.
1 reply →
> It's interesting how many users in this thread are instinctively siding with the offenders in this, and not the victims.
That is infantile. Painting people advocating privacy as siding with offenders is highly insulting.
This is a situation where different people's privacy is in conflict. What's infantile is claiming sole ownership of privacy advocacy while so-whating the worst privacy violation imaginable, from the victims' perspective.
3 replies →
I feel it's a little disingenuous to describe millions of innocent people being surveilled as "the offenders" because there are a handful of actual offenders among them.
I didn't do that...?
There's a small number of victims, a small number of offenders (but many more than "a handful"), and hundreds of millions of other users. This change is in the direct interest of victims and in direct opposition to offenders.
Most normal people probably support the measures in solidarity with group 1, HN generally doesn't.
6 replies →
Presumptuous. I certainly don't want this pseudo-righteous power grab done for me.
What about the victims of the Apple employees and government officials who exploit this?
It is a violation of their rights. But we have a justice system set up which makes distributing such images knowingly (and committing such acts with the intent to distribute those images) a crime.
It's also incredibly likely this could be used to send people you don't like to prison, by sending their phone innocuous images crafted to look, to the matching algorithm, like known bad images.
It's also also incredibly likely this will evolve into scanning for more than just abuse photos, especially in the hands of governments around the world.