Comment by QuiEgo
6 hours ago
It'd be ideal if the phone manufacturer had a way to delegate trust and say "you take the risk, you deal with the consequences" - unlocking the bootloader used to be exactly that. Now platforms are moving toward treating any unlocked device as uniformly untrusted, because of all the security problems an untrusted device can cause once it's allowed inside the trust boundary.
We can't have nice things because bad people abused it :(.
Realistically, we're moving to a model where you'll have to have a locked down iPhone or Android device to act as a trusted device to access anything that needs security (like banking), and then a second device if you want to play.
The really evil part is things that don't need security (like say, reading a website without a log in - just establishing a TLS session) might go away for untrusted devices as well.
> We can't have nice things because bad people abused it :(.
You've fallen for their propaganda. It's a bit off topic from the OnePlus headline, but as far as bootloaders go, we can't have nice things because vendors and app developers want control over end users. The Android security model is explicit that the user, the vendor, and the app developer are each party to the process and each can veto anything. That's fundamentally incompatible with my worldview and I explicitly think it should be legislated out of existence.
The user is the only legitimate party to what happens on a privately owned device. App developers are to be viewed as potential adversaries that might attempt to take advantage of you. To the extent that you are forced to trust the vendor they have the equivalent of a fiduciary duty to you - they are ethically bound to see your best interests carried out to the best of their ability.
> That's fundamentally incompatible with my worldview and I explicitly think it should be legislated out of existence.
The model that makes sense to me personally is that private companies should be legally required to be absolutely clear about what they are selling you. If a company wants to make a locked down device, that should be their right. If you don't want to buy it, that's your absolute right too.
As a consumer, you should be given the information you need to make the choices that are aligned with your values.
If a company says "I'm selling you a device you can root", and people buy the device because that's advertised, the company should be on the hook to uphold that promise. The nasty thing in this thread is the potential rug pull by OnePlus, especially as they have marketed themselves as the alternative to companies that lock their devices down.
I don't entirely agree, but neither would I be dead set against such an arrangement. Consider that, for example, while private banks are free not to do business with you, at least in civilized countries there is a government-associated bank that will always do business with anyone. Mobile devices occupy a similar space: there would always need to be a vendor offering user-controllable devices. And we would also need legal protections against app authors, given that banking apps, for example, currently pick and choose which device configurations they will run on.
I think it would be far simpler and more effective to outlaw vendor-controlled devices. Note that this wouldn't prevent the existence of some sort of opt-in key escrow service where users voluntarily turn over control of the root of trust to a third party (possibly the vendor themselves).
You can already basically do this on Google Pixel devices today. Flash a custom ROM, relock the bootloader, and disable bootloader unlocking in settings. Control of the device is then held by whoever controls the keys at the root of trust of the flashed ROM, with the caveat that if you can log in to the phone you can re-enable bootloader unlocking.
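For reference, the Pixel flow described here looks roughly like the sketch below. This is not a complete guide: the ROM flashing step itself varies by ROM, and the key file name `avb_pkmd.bin` is the convention used by ROMs that support relocking (e.g. GrapheneOS publishes its verified-boot public key under that name); check your ROM's own instructions before relocking, since locking with the wrong key can leave the device unbootable.

```shell
# Sketch: relocking a Pixel bootloader against a custom ROM's signing key.
# Assumes: bootloader already unlocked, device in fastboot mode, and the
# custom ROM (one that supports relocking) has already been flashed.
# avb_pkmd.bin is the ROM's Android Verified Boot public key (name varies).

# Install the ROM's verified-boot public key so the locked bootloader
# accepts images signed by the ROM's keys instead of Google's:
fastboot erase avb_custom_key
fastboot flash avb_custom_key avb_pkmd.bin

# Relock the bootloader (this wipes user data):
fastboot flashing lock
```

After first boot, turning off "OEM unlocking" in Developer options is what adds the final caveat mentioned above: re-enabling it requires being able to log in to the phone.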
>and then a second device if you want to play.
With virtualization this could be done on the same device. The "play" VM can be properly isolated from the secure one.
How is that supposed to fix anything if I don't trust the hypervisor?
It's funny, GP framed it as "work" vs "play" but for me it's "untrusted software that spies on me that I'm forced to use" vs "software stack that I mostly trust (except the firmware) but BigCorp doesn't approve of".
Then yes, you will need another device. Same if you don't trust the processor.