
Comment by croutonwagon

4 years ago

In years previous (take the San Bernardino shooter case, for instance), Apple argued in court that creating backdoors or reversible encryption is insecure and subject to exploitation by malicious actors, and therefore not reasonable and "unreasonably burdensome". They also argued that compelling them to write backdoors violated the First Amendment.

It was most likely a winning strategy, which is why the FBI actively avoided getting a ruling on it and found a workaround instead.

What Apple is creating here is an avenue for the FBI/NSA/alphabet agencies to obtain a FISA warrant and NSL to mandate hits on anything. The argument that it's gotta be pre-iCloud-upload, or subject to manual review, or gated on some arbitrary threshold, is just the marketing to get the public to accept it.

All of that can easily be ordered bypassed. So it can become: scan, single hit for X, report.
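The "arbitrary threshold" point can be made concrete. Here is a minimal sketch of a hypothetical on-device scanner, not Apple's actual protocol: `perceptual_hash`, `blocklist`, and `report_threshold` are illustrative names. The point is that the threshold is just a parameter, and whoever controls it decides when "scan, single hit, report" kicks in.

```python
from typing import Iterable, Set

def perceptual_hash(data: bytes) -> int:
    """Stand-in for a perceptual hash such as NeuralHash (illustrative only)."""
    return hash(data) & 0xFFFFFFFF

def scan(files: Iterable[bytes], blocklist: Set[int], report_threshold: int) -> bool:
    """Return True once the number of blocklist hits reaches the threshold."""
    hits = 0
    for data in files:
        if perceptual_hash(data) in blocklist:
            hits += 1
            if hits >= report_threshold:
                return True  # this is where a real system would phone home
    return False

files = [b"vacation.jpg", b"document.pdf", b"target.jpg"]
blocklist = {perceptual_hash(b"target.jpg")}

# With a high multi-match threshold, a single hit reports nothing:
print(scan(files, blocklist, report_threshold=30))
# Ordered to set the threshold to 1, the same code is scan-hit-report:
print(scan(files, blocklist, report_threshold=1))
```

Nothing in the matching logic changes between the two calls; only the number does, which is why a threshold is a policy promise rather than a technical safeguard.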

I'll take the downvotes, but if anything, someone more conspiracy-minded could easily take this as a warrant canary. Given the backlash Apple has faced and ignored, it doesn't make much business sense for them not to back off unless they are:

A) betting on it being a vocal minority that resorts to action (which is entirely possible, especially given the alternatives and the technical hurdles involved in reaching a suitable alternative), or

B) being pressured by governments now (also entirely possible, given their history with the FBI and previous investigations).

[1] https://www.rpc.senate.gov/policy-papers/apple-and-the-san-b...

[2] https://en.wikipedia.org/wiki/FBI%E2%80%93Apple_encryption_d...

> What Apple is creating here is an avenue for the FBI/NSA/alphabet agencies to obtain a FISA warrant and NSL to mandate hits on anything. The argument that it's gotta be pre-iCloud-upload, or subject to manual review, or gated on some arbitrary threshold, is just the marketing to get the public to accept it.

Why would they make things even more complicated with limited access, when they can already access everything in the cloud? Let's leave aside the argument about expanding the scan to the whole device. If that is what happens, then people will really start discarding their phones.

  • Well, for one, scanning on-device lets them expand the amount of stuff they search for without any impact on their servers.

    We can all assume they will eventually start scanning for more things than just photos, and not only before they are sent to iCloud. It can easily and _silently_ be expanded to any file on the phone.

  • Because they don't have access to everything in the cloud. You don't have to use iCloud, or Siri, or Spotlight.

    This was specifically addressed in the San Bernardino and other cases. Apple gave the FBI everything in the cloud; the FBI was looking for everything on the device.

    What this change does is add a method, without an opt-out option, for them to scan for anything on the device, be it a string of text/keywords, or certain pictures of a place with certain metadata, etc.

    • This is just speculation. The current technical implementation limits the scan to images about to be uploaded to the cloud, which can be opted out of. If you don't trust that, you can't trust using their devices right now either.
