Comment by bitpush
9 hours ago
How do you square this with Apple's pushback a few years back against the FBI, which asked for a specific individual's details?
I'm not taking sides, but it sounds like if ChatGPT cooperating with LE is a Good Thing (TM), then Apple making a public spectacle of how they are not going to cooperate is... bad?
I'm fully aware that Apple might not even be able to provide them the information, which is a separate conversation.
>How do you square this with Apple's pushback a few years back against the FBI, which asked for a specific individual's details?
See: https://en.wikipedia.org/wiki/Apple%E2%80%93FBI_encryption_d...
>Most of these seek to compel Apple "to use its existing capabilities to extract data like contacts, photos and calls from locked iPhones running on operating systems iOS 7 and older" in order to assist in criminal investigations and prosecutions. A few requests, however, involve phones with more extensive security protections, which Apple has no current ability to break. These orders would compel Apple to write new software that would let the government bypass these devices' security and unlock the phones.[3]
That's very different from OpenAI dumping some rows from their database. If ChatGPT were end-to-end encrypted and they wanted OpenAI to backdoor their app, I would be equally opposed.
Interesting that it wound up not being Cellebrite. I thought for years it was; I wonder if Cellebrite had people lie to the press that it was them. Really effective marketing.
I agree, the line is at messing with end-to-end encryption. If your E2EE has a backdoor, IT'S NOT END TO END ENCRYPTION. Thanks.
It's not exactly E2EE. iPhone storage is locked with a 6-digit numeric passcode in most cases, which is basically no entropy. The whole thing relies on hardware security (the enclave). At least in older phones, that just meant security through obscurity since Apple's trade secrets were enough to unlock it, but maybe newer ones can't be unlocked even by Apple.
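To put a number on "basically no entropy": a quick back-of-the-envelope sketch in plain Python (the 1,000 guesses/second rate is an arbitrary assumption for illustration, not a measured figure for any real cracking tool):

```python
import math

# A 6-digit numeric passcode has only 10^6 possible values.
combinations = 10 ** 6
entropy_bits = math.log2(combinations)  # ~19.9 bits -- trivially small

# If the hardware rate-limiting (the enclave's job) is bypassed and an
# attacker tries a hypothetical 1,000 passcodes per second offline:
worst_case_seconds = combinations / 1000
print(f"{entropy_bits:.1f} bits, exhausted in at most {worst_case_seconds:.0f} s")
```

Which is the point: the passcode itself protects nothing; everything hinges on the enclave enforcing guess limits.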
With CALEA and related laws, companies that don't keep logs can be compelled to surveil certain users from that point forward, even if that means installing hardware/software that keeps logs on them.
The difference is that in this case OpenAI was able to produce the requested information without compromising security for their other customers.
Right, for the OpenAI case to be analogous, they would have to switch to a system where your chats are homomorphically encrypted -- i.e. OpenAI does all its operations without knowing either the input or output plaintext. In that case, they'd only have encrypted chats to begin with, and would have to somehow get your key to comply with a warrant for the plaintext.
And note: the above scenario is not likely anywhere in the near future, because homomorphic encryption has something like a million times overhead, and requires you to hit the entire database on every request, when state-of-the-art LLM systems are already pushing the limits of computation.
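For anyone unfamiliar with the homomorphic property being described, here's a toy illustration using textbook RSA, which happens to be multiplicatively homomorphic. To be clear: this is deliberately insecure, uses made-up tiny numbers, and is nothing like the fully homomorphic schemes an LLM would need (hence the massive overhead mentioned above) — it only shows the core idea that a server can compute on ciphertexts without ever seeing plaintext:

```python
# Textbook RSA: E(a) * E(b) mod n decrypts to a * b, so the "server"
# can multiply values it cannot read. NOT secure, NOT fully homomorphic.

p, q = 61, 53            # tiny illustrative primes
n = p * q                # modulus: 3233
e = 17                   # public exponent
d = 2753                 # private exponent: d * e = 1 mod (p-1)*(q-1)

def enc(m): return pow(m, e, n)
def dec(c): return pow(c, d, n)

a, b = 7, 6
c = (enc(a) * enc(b)) % n   # server multiplies ciphertexts only
print(dec(c))               # prints 42 -- computed without exposing 7 or 6
```

A real FHE scheme supports both addition and multiplication on ciphertexts with noise management, which is where the crushing performance cost comes from.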
Yes. I'm glad the FBI was able to crack the phone without Apple's help in that San Bernardino case, which humiliated Apple as a little bonus.
Apple also tried to freak the public out by saying the FBI wanted a backdoor added, which was inaccurate. You can't retroactively add a backdoor; that's the whole point of one. The FBI wanted Apple to unlock a specific phone, which Apple said they were already capable of doing.
With my current knowledge of the case, I'd say Apple was clearly in the moral wrong and it's a pretty dark mark in their past.
My understanding is the suspect was detained and law enforcement was not asking for a dragnet (at least that's what they stated publicly); they were asking for a tool for a specific phone. Apple stated the FBI was asking them to backdoor all iPhones, then the FBI countered and said that's not what they were asking for. Apple then marched triumphantly into the moral sunset over the innocent victims; meanwhile, the FBI sent funds to a dubious group with questionable ethics and ties to authoritarian regimes.
In my opinion, Apple should have expediently helped here, if for no other reason than to prevent the funding of groups that support dragnets, but also out of moral obligation to the victims.
Are you certain Apple could unlock this phone (short of making a software change that compromised all iPhones)?
And why would it matter? Even if the capability to create a magic key that unlocked a specific phone remained entirely within a company's hands for future use, why wouldn't the courts just continue to ask them to use it? It's not like the victims of all sorts of other crimes don't similarly deserve justice.
Law enforcement at the time was even admitting (which we'd later find out to be correct) that there was likely nothing of value on the phone. It seems fairly obvious that the FBI was trying to use a high-profile case to force a paradigm shift. Perhaps we can argue it would have been a good and just one, but arguing that they weren't attempting one seems wrong.
Apple said they could do it. And they didn't tell the FBI they couldn't do it; they said they didn't want to.
I make no claim either way nor do I have insider knowledge of what they could and could not do.
Seeing how strained your good-faith interpretation is has further entrenched my belief that San Bernardino was a false flag operation by the FBI.
There is no world in which a post-PRISM compliant Apple cannot be coerced by the feds for an investigation. It's just a matter of how much pressure the FBI wanted to apply; Apple's colossal marketing win is the sort of thing that you would invent if you wanted to manufacture consumer trust, not "prove" anything to cryptographers. Playing devil's advocate, "authoritarian regimes" are exactly the sort of place you would send the iPhone to if you already had the information and wanted to pretend like it was hard to access.
If we assume a worst-case-scenario where Apple was already under coercion by the FBI, everything they did covers up any potential wrongdoing. It was all talk, no walk. Neither side had to show any accountability, and everyone can go on happily using their devices for private purposes.