Comment by gruez

9 hours ago

>How do you square this with Apple's pushback few years back against FBI who asked for a specific individual's details.

See: https://en.wikipedia.org/wiki/Apple%E2%80%93FBI_encryption_d...

>Most of these seek to compel Apple "to use its existing capabilities to extract data like contacts, photos and calls from locked iPhones running on operating systems iOS 7 and older" in order to assist in criminal investigations and prosecutions. A few requests, however, involve phones with more extensive security protections, which Apple has no current ability to break. These orders would compel Apple to write new software that would let the government bypass these devices' security and unlock the phones.[3]

That's very different from OpenAI dumping some rows from its database. If ChatGPT were end-to-end encrypted and they wanted OpenAI to backdoor its app, I would be equally opposed.

Interesting that it wound up not being Cellebrite. I thought for years it was; I wonder if Cellebrite had people lie to the press that it was them. Really effective marketing, if so.

I agree, the line is at messing with end-to-end encryption. If your E2EE has a backdoor, IT'S NOT END-TO-END ENCRYPTION. Thanks.

  • It's not exactly E2EE. iPhone storage is locked with a 6-digit numeric passcode in most cases, which is basically no entropy. The whole thing relies on hardware security (the Secure Enclave). At least in older phones, that amounted to security through obscurity, since Apple's trade secrets were enough to unlock it; newer ones may not be unlockable even by Apple.
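A quick back-of-the-envelope sketch of why a 6-digit passcode is so weak without hardware rate limiting (the guess rate below is an illustrative assumption, not a measured figure):

```python
import math

# A 6-digit numeric passcode has 10^6 possible values,
# which works out to roughly 20 bits of entropy.
pin_space = 10 ** 6
entropy_bits = math.log2(pin_space)
print(f"{entropy_bits:.1f} bits")  # ~19.9 bits

# Assumed guess rate for illustration: without the enclave's
# rate limiting, even a modest 1,000 guesses/second exhausts
# the entire space in well under an hour.
guesses_per_second = 1_000
minutes_to_exhaust = pin_space / guesses_per_second / 60
print(f"{minutes_to_exhaust:.0f} minutes")  # ~17 minutes
```

This is why the security of the scheme rests almost entirely on the enclave throttling and wiping after failed attempts, not on the passcode itself.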

Under CALEA and related laws, companies that don't keep logs can still be compelled to surveil specific users from that point forward, even if that means installing hardware or software that logs their activity.