Comment by slg

4 years ago

>So what happens when, in a few years at the latest, a politician points that out, and—in order to protect the children—bills are passed in the legislature to prohibit this "Disable" bypass, effectively compelling Apple to scan photos that aren’t backed up to iCloud? What happens when a party in India demands they start scanning for memes associated with a separatist movement? What happens when the UK demands they scan for a library of terrorist imagery? How long do we have left before the iPhone in your pocket begins quietly filing reports about encountering “extremist” political material, or about your presence at a "civil disturbance"? Or simply about your iPhone's possession of a video clip that contains, or maybe-or-maybe-not contains, a blurry image of a passer-by who resembles, according to an algorithm, "a person of interest"?

What I don't get is what prevented these things from happening last month. Apple controls the hardware, the software, and the cloud services, so the point at which the scanning is done is mostly arbitrary from a process standpoint (I understand people believe there are huge differences philosophically). They could have already scanned our files because they already have full control over the entire ecosystem. If they can be corrupted by authoritarian governments, then shouldn't we assume they have already been corrupted? If so, why did we trust them with full control of the ecosystem?

In previous years, take the San Bernardino shooter for instance, Apple argued in court that creating backdoors or reversible encryption was insecure, subject to exploitation by malicious actors, and therefore "unreasonably burdensome". They also argued that compelling them to write backdoors violated the First Amendment.

It was most likely a winning strategy, which is why the FBI actively avoided getting a ruling on it and found a workaround instead.

What Apple is creating here is an avenue for the FBI/NSA/alphabet agency to use a FISA warrant and an NSL to mandate hits on anything. The argument that it's got to be pre-iCloud-upload, subject to manual review, or gated on some arbitrary threshold is just the marketing to get the public to accept it.

All of that can easily be ordered to be bypassed. So it can become: scan, single hit for X, report.

I'll take the downvotes, but if anything, someone more conspiracy-minded could easily take this as a warrant canary. Given the backlash Apple has faced and ignored, it doesn't make much business sense for them not to back off unless they are:

A) betting that the backlash is a vocal minority unlikely to resort to action (which is entirely possible, especially given the technical hurdles in getting to a suitable alternative), or

B) being pressured by governments now (also entirely possible, given their history with the FBI and previous investigations).

[1] https://www.rpc.senate.gov/policy-papers/apple-and-the-san-b...

[2] https://en.wikipedia.org/wiki/FBI%E2%80%93Apple_encryption_d...

  • > What Apple is creating here is an avenue for the FBI/NSA/alphabet agency to use a FISA warrant and an NSL to mandate hits on anything. The argument that it's got to be pre-iCloud-upload, subject to manual review, or gated on some arbitrary threshold is just the marketing to get the public to accept it.

    Why would they make things even more complicated with limited access, when they can already access everything in the cloud? Let’s leave aside the argument about expanding the scan to the whole device. If that happens, then people will really start discarding their phones.

    • Well, for one, scanning on-device lets them expand the amount of stuff they search for without any impact on their servers.

      We can all assume they will eventually start scanning for more than just the photos about to be sent to iCloud. It can easily and _silently_ be expanded to cover any file on the phone.


    • Because they don't have access to everything in the cloud. You don't have to use iCloud, or Siri, or Spotlight.

      This was specifically addressed in the San Bernardino and other cases. Apple gave the FBI everything in the cloud. The FBI was looking for everything on the device.

      What this change does is add a method, without an opt-out option, for them to scan for anything on the device, be it a string of text/keywords, or certain pictures of a place with certain metadata, etc.


It isn't a philosophical debate. It's about invading and controlling someone else's property. I can't shack up in your home and eat your food just because I feel like it. We're all doomed because digital natives have no concept of the boundary between something they own and something someone rents to them, or lets them use for free in exchange for data mining.

  • Like I said, Apple controls the hardware, software, and services. They already control your property.

    • There's a substantial difference - both in theory and in practice - between Apple being capable of making your device do things you don't want it to do, and them actually doing it.

      Saying that just because they could have, we ought to be okay with them actually doing it is nonsense. If you apply that line of thought to a non-updatable product, it becomes pretty clear.

      Pick basically anything man-made around you - your shoes, your couch, whatever. That could have plenty of awful things in it. It could be spying on you, it could be poisoning you, whatever.

      Just because the manufacturer could have done something terrible, doesn't mean we're okay with them actually doing it. The mere fact Apple can do these things after purchase doesn't make it any more acceptable for them to do so.


  • The concept of ownership you are asserting is but one of many historical principles of ownership. There are however, other concepts of ownership that conflict with what you are asserting.

    https://www.econtalk.org/michael-heller-and-james-salzman-on...

    I don't think there is a good faith argument that Apple is invading or controlling anything you own. All that's happening is you agree to run the algorithm in exchange for using iCloud Photos. That's just a contract; a mutual, voluntary exchange.

    • Contract implies meeting of the minds. I'd like to see the process by which I can line out or alter the terms of the agreement please.

      I'll wait.

      This is the other thing "digital natives" don't get, nor want to. Negotiation is normal. Ya know something else they don't get? Selling something with the damn manual, and enough system documentation to actually be able to sit down and learn something. Drives me nuts.


    • Among the "historical principles of ownership" are those from the communist countries, where the individual humans had the legal right to own only things belonging to a very short list and nothing else.

      However, the USA has claimed for decades that such restrictions on the rights of ownership are an abomination.

      Even if we accepted that this is just a modification of a contract between the "owner" of a device and Apple: had Apple acted in good faith, they would have offered that if you do not agree to let Apple run programs on your own device for their benefit (something never mentioned when you "bought" the device), then Apple will fully refund everything you paid for your device and other Apple services, so that you can get an alternative device.

      As it is now, you either accept that Apple changes their "contract" at any time as they like, or you incur a serious financial loss if you do not accept and you want an alternative.

      This certainly isn't a "mutual, voluntary exchange".

    • The problem is that companies make alternatives illegal, obfuscate the legal terms, put themselves in the position of least resistance, and force you to opt in.

  • Apple is renting the phone to you for $1000 down and $0 a month (unless you actually are financing). Therefore, they are the landlord and, given notice, can change the property as they see fit.

    • This is demonstrably not true. If you rent a home and then burn it down, you are going to be held liable to the owner of the home. In the case of your phone, no one, including Apple, cares if you buy it and then immediately smash it on the ground and destroy it.

      Apple controls the software that runs on it but there is nothing that stops you from modifying or hacking it to your heart's content if you are able to, just as they are not obligated to make that an easy task for you.


  • I agree we are all doomed, but I don’t agree it has that much to do with being a digital native or not. My boomer grandparents, my gen X parents, and my millennial self are all affected by this. And gen Z (the first generation of digital natives), and whatever comes after gen Z, is not to blame for that. Reducing it to a generational thing is silly.

    • I think the point was that the digital natives and the next generation of digital natives coming will not know any different and will thus tacitly accept it.

I think “we don’t have the machinery to do that” is an effective argument in the real world when someone asks you to do something. I’m not sure if it matters legally (lawyers sometimes use vague phrases like “reasonable effort”), but it definitely affects how strongly people will pressure you to do things, and how likely you are to acquiesce to that pressure.

The scope of the change Apple would need to make to scan your photos arbitrarily just got a lot smaller. The number of engineers who would need to be “in the know” to implement this change got smaller. The belief from governments that Apple has the option of doing this got stronger. The belief among Apple’s own management team that they can do this got stronger.

Because that door hadn’t been opened yet. “Scan every photo on users’ devices” or “scan for non-CSAM” are much easier requests once they’ve already started scanning on-device.

It’s just how life and politics work.

  • The door has been opened for quite some time. What do you think Spotlight is? It scans and indexes all your data.

    What's prevented the government from saying "hey, if you see Osama bin Laden in a Spotlight scan, you need to send us all that guy's data"?

    The answer is, Apple can just say FU. And that's exactly what will happen here. In particular, the US DOJ needs to stay in Apple's good graces here and not be overly aggressive. If DOJ pulls any funny business, that's a pretty good reason for Apple to just say "OK, we're picking up our toys and going home. You get nothing now and we're turning on E2EE."

    • I'm not a security professional by any means, but this has been my line of thinking on this whole debate for quite a while. It's pretty silly considering what has been made public about the clandestine operations of alphabet agencies (if you were paying attention to the right channels[1], there was good reason to believe the Fourth Amendment was a joke to the Feds long before Snowden's leaks), especially combined with the existence and complete opaqueness of the secret FISA court. It's kinda crazy to me that all these technologists, and especially those on *hacker*news, really believe that you have any sort of privacy from the US government, which has demonstrated it can act with complete impunity in most parts of the world for decades. I say especially people here because they should know how just a handful of rogue actors in any given organization could subvert any veil of privacy. I'm not an expert by any means, but it makes complete sense to me that privacy in any large organization is a very delicate thing to maintain when your adversary is as sophisticated and belligerent as the US security and intelligence apparatus appears to be. Maybe I'm just not privy to something, but it seems like if the US national security apparatus wants to do something on our or our allies' soil, they'll find a way.

      [1] https://www.pbs.org/wgbh/frontline/film/homefront/ - aired 15-5-07 and covered the notorious AT&T Room 641A

    • I’m sorry, but there is a world of difference between locally indexing files for local search and tagging files as contraband so that they can be reported to the government.


    • DOJ can pressure Visa, Mastercard and Amex to stop processing payments for Apple. Due to how the international payments systems work, that's a global sanction, even if Apple had no footprint in US.

      And before you claim that's absurd and impossible, there is precedent for US doing just that.[0]

      EDIT: There is also an earlier precedent, UIGEA - https://en.wikipedia.org/wiki/Unlawful_Internet_Gambling_Enf...

      0: https://www.cnet.com/tech/services-and-software/credit-card-... (yes, the blockade was lifted later but my point was that the nuclear option is available)


This entire argument is a non sequitur and comes up like clockwork every time this issue is discussed. It's the metaphorical equivalent of saying "well someone could've snuck in through the open window. Let's just assume they did and leave the doors open as well".

How about instead we push back against Apple further shifting the Overton window on how acceptable it is for companies to run intrusive services on hardware we own?

  • It’s not a non sequitur. The comment is engaging with a series of rhetorical questions that imagine a slippery slope by observing that very little has changed about the trust model between iPhone users and their devices. If you are convinced Apple is slipping, then it is worthwhile to be able to answer how their position today is different than it was last month. That is of course a different question than whether their position last month was acceptable, and maybe people are realizing it was not.

    As a concrete example, if you think the proposal introduces new technical risks, then if Apple announces they made a mistake and will instead scan entirely on the server, you may be satisfied. However, I’d argue that since no new technical risk has been introduced, your conclusions should not change.

    I’d argue that the incorrect characterization of Apple’s announcement as scanning all the files on your phone with no control has shifted the Overton window more than what was actually proposed. Politicians who are none the wiser probably believe that’s what Apple actually built, even though it’s not.

    • I disagree - it's a distraction from the larger issue at hand.

      > I’d argue that the incorrect characterization of Apple’s announcement as scanning all the files on your phone with no control

      That's a strawman - few if any are arguing that the system will read all of your files out of the gate.

      >since no new technical risk has been introduced,

      This assumption doesn't reflect reality. Introducing a brand new system built specifically for client-side scanning absolutely adds technical risks, if nothing else then by the sheer fact that it's adding another attack vector on your phone. Not to mention the fact that all it would take is a change in policy and a few trivial updates (a new event trigger, directory configs, etc) for this system to indeed scan any file on your device.


> What I don't get is what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services...

Simple: Money.

Their response to any such demands would be (and has been) "we don't have the capability to do what you're asking".

No judge is going to burden them to spend their own dime to build a massive new feature like this and deploy it to every phone out there to comply with a demand arising from a prosecutor of an individual case.

No government is going to pony up the money to reimburse them to do it (not even getting into the PR optics).

That leaves it happening only if 1) they decide to do it themselves, or 2) government(s) legislate they must.

So far #2 hasn't happened. Politicians had no point of reference to say "Your competitors are doing that, you should too".

But now that #1 occurred, it will normalize this nonsense and pave the way for #2.

  • > Their response to any such demands would be (and has been) "we don't have the capability to do what you're asking".

    > No judge is going to burden them to spend their own dime to build a massive new feature like this and deploy it to every phone out there to comply with a demand arising from a prosecutor of an individual case.

    Government does not care one bit about how much it costs or whether it is even possible. They demand the data with an ultimatum: deliver it as we requested by our deadline, or we send in our IT people to take it. Sorry (not sorry) if it takes your whole company down while we plug our own servers into your datacenter to take your data.

    • Doesn't work if the data of interest is not there for the taking. And a judge will not compel beyond what they consider reasonable. Having the feature already in place dramatically shifts the bar.

  • Their response to such demands has not been we are technically incapable of doing what’s requested. The demand from the FBI in the San Bernardino case was a very small change to passcode retry constants, because the terrorist’s device did not have a Secure Element.

The politics of it is very different, and that's where the danger lies:

https://news.ycombinator.com/item?id=28239506

I think that quite a few engineers are too focused on the technical aspects of it, and specifically on all those "barriers to misuse" that Apple claims to have in place. But it'll be much easier to remove the barriers once the system as a whole is in place.

  • The reason we're focused on the state of it now is that we can switch at any time - especially if those barriers are shown to be ineffective or are removed at some point.

There is a fairly large difference, the first being that it would massively damage Apple's brand if they started scanning people's phones without permission.

But now that they've built the system to scan things on-device, they can be compelled by a government to scan for other things, and Apple can shrug and say they had no choice.

  • Why would Apple start shrugging now when they've been fighting the FBI in court?

    • One reason is that they weren't under antitrust scrutiny in 2016 when they fought the government in court.

      Their incentives have changed - they now have the real looming threat of being broken up by governments, so it is now in their interest to comply with anything else governments ask them to do.


> They could have already scanned our files because they already have full control over the entire ecosystem.

Apple barely submits any CSAM reports[0]:

> According to NCMEC, I submitted 608 reports to NCMEC in 2019, and 523 reports in 2020. In those same years, Apple submitted 205 and 265 reports (respectively). It isn't that Apple doesn't receive more picture than my service, or that they don't have more CP than I receive. Rather, it's that they don't seem to notice and therefore, don't report.

0: https://www.hackerfactor.com/blog/index.php?/archives/929-On...

The existing law and user agreement also define your rights.

Apple might have done what you say last month, illegally.

Now it's in the user agreement and they can do it legally, at scale.

This literally creates a precedent.

Nothing except Apple saying you could trust them. People were stupid enough to accept that and now even the trust is gone.

That's rational, but the point he's making is that this system obliterates the only defense we have had or could have against such activity: end-to-end encryption. This approach owns the endpoint.

  • …in the same way any existing feature of iOS that makes device data available to Apple (eg iCloud Backup) “owns” the endpoint, no? What’s to stop a malicious Apple from turning on iCloud Backup for all its users and hoovering up your Signal messages database and iCloud Keychain?

    • Nothing. iOS even defaults autoupdate to on, so Apple could do this without your interaction today.

> What I don't get is what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services...

Yes, proprietary black-box hardware and software is poor from a user-privacy perspective. But if Apple began on-device scanning of content, I'd imagine someone would eventually notice the suspicious activity and investigate.

With Apple's announcement, the scanning will just be something that Apple devices do. Nothing to worry about. And, no way for anyone to independently verify that the scope of the content being scanned has not been secretly increased.

As for iCloud: if your content is not encrypted on the device in a manner where only you have the keys, any cloud storage is suspect for scanning / data mining. But on-device scanning is a back door around e2e encryption: even on-device encryption with keys only you control is thwarted, because the content is scanned in plaintext before it is ever encrypted.
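The ordering argument can be made concrete. A minimal sketch, assuming a toy hash blocklist and a toy XOR "cipher"; every name here is illustrative, and none of this is Apple's actual implementation:

```python
import hashlib

def scan(plaintext: bytes, blocklist: set) -> bool:
    # On-device match against a hash blocklist -- runs on the plaintext,
    # before any encryption happens.
    return hashlib.sha256(plaintext).digest() in blocklist

def e2e_upload(plaintext: bytes, key: bytes, blocklist: set) -> tuple:
    flagged = scan(plaintext, blocklist)                        # sees plaintext
    ciphertext = bytes(b ^ k for b, k in zip(plaintext, key))   # toy cipher
    return ciphertext, flagged                                  # flag leaks anyway

blocklist = {hashlib.sha256(b"banned content").digest()}
ciphertext, flagged = e2e_upload(b"banned content", b"\x42" * 14, blocklist)
assert flagged and ciphertext != b"banned content"
```

However strong the cipher, the scanner's verdict is computed before the key is ever applied, which is why client-side scanning is described as moving the inspection point inside the endpoint.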

  • > no way for anyone to independently verify that the scope of the content being scanned has not been secretly increased.

    This seems like the easiest thing out of the lot to verify.

    The way that this system is designed to work is that when uploading to iCloud Photos, images have a safety voucher attached to them.

    If Apple secretly expanded this to scan more than just iCloud Photos, they would have to either a) upload all the extra photos, b) add a new mechanism to upload just the vouchers, or c) upload “fake” photos to iCloud Photos with the extra vouchers attached.

    None of these seem particularly easy to disguise.

    Your concern is completely understandable if you are starting from the premise that Apple are scanning photos then uploading matches. I think that’s how a lot of people are assuming this works, but that’s not correct. Apple designed the system in a very different way that is integrated into the iCloud upload process, and that design makes it difficult to expand the scope beyond iCloud Photos surreptitiously.

    Could Apple build a system to secretly exfiltrate information from your phone? Of course. They could have done so since the first iPhone was released in 2007. But this design that they are actually using is an awful design if that’s what they wanted to do. All of their efforts on this seem to be pointed in the exact opposite direction.
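A minimal sketch of the voucher-coupling argument above, with hypothetical names (`make_voucher` stands in for the on-device NeuralHash/blinded-match step; this is an illustration of the design argument, not Apple's code):

```python
import hashlib

def make_voucher(image: bytes) -> bytes:
    # Stand-in for the on-device perceptual-hash + blinded-match step.
    return hashlib.sha256(image).digest()

def upload_to_icloud_photos(image: bytes, server: list) -> None:
    # The voucher exists only as an attachment to an iCloud Photos upload:
    # if the photo is never uploaded, no voucher ever leaves the device.
    server.append({"image": image, "voucher": make_voucher(image)})

server = []
upload_to_icloud_photos(b"vacation photo bytes", server)
assert all("voucher" in item for item in server)
```

Widening the scan beyond iCloud Photos would therefore need a new exfiltration path for the extra vouchers, which is the observable change that options (a), (b), and (c) describe.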

  • How do you think Apple will increase the scope of what’s scanned without every person with Ghidra skills noticing?

    • If the exchange with Apple is encrypted / interleaved with other traffic to iCloud, how would you know that there aren't new classes of scanning being done?

      I'll be very surprised if similar tech is not lobbied for as a backstop to catch DRM-free media files played on devices we "own".

      And it seems far more probable than not that police will demand this capability be used to help address more crimes. The problem here is that "crimes" can mean speaking out against an oppressive regime, or being targeted for having the wrong political views (think McCarthyism in the United States, or the US-backed murder of a million people in Indonesia for affiliating with the "wrong" political party). History is awash with political abuse of "out groups", perpetrated by everyone from tin-pot dictators to presidents and PMs of major world powers.

      And it sets the precedent that e2e encryption is not an excuse for a provider to withhold private customer data from the authorities: a back door can be installed. "Just do what Apple did."