Comment by kogepathic
1 day ago
> What I am asking for: publish a basic GitHub repo with the hardware specs and connection protocols. Let the community build their own apps on top of it.
This concept works fine for the author's example of a kitchen scale, but fails when the device in question is something like a router that has secure boot with one key burned into e-fuses.
In that case we need both open software and a requirement that the manufacturer escrow signing keys with someone so that after EOL any software can be run.
Forcing the release of signing keys would be a security disaster. The first person to grab the expired auto-update domain of an EOL IoT device gets a free botnet.
The only real way to make devices securely re-usable with custom firmware is to require some explicit step that signals the user wants to run 3rd-party firmware: a specific button press sequence is enough. The point is that the user must do something deliberate to acknowledge that 3rd-party software is being installed.
Forcing vendors to release their security mechanisms to the public and allow anyone to sign firmware as the company is not what you want, though.
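To make the button-press idea concrete, here is a minimal sketch of the edge-counting a boot ROM could run during an early-boot window. Everything here is illustrative (the function name, the threshold, the sampling model are all made up); a real implementation would poll a GPIO in hardware:

```python
def unlock_sequence_entered(samples, presses_required=3):
    """Decide whether the user deliberately requested 3rd-party
    firmware. `samples` is a list of booleans taken at a fixed
    interval during the early-boot window (True = button held).
    Counting rising edges means merely holding the button, or a
    stuck button, is not enough: distinct presses are required."""
    presses = 0
    prev = False
    for down in samples:
        if down and not prev:  # rising edge = a new press
            presses += 1
        prev = down
    return presses >= presses_required
```

The flag this sets would apply to the current boot only; anything less than the full sequence boots normally with signature enforcement on.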
Embedded devices that go on the internet by themselves (to update) are an anti-pattern.
I run a bunch of stuff using Home Assistant via the Zigbee integration - the Zigbee host on the local server gets to decide where to fetch updates from - which was the security mechanism for most software for most of history.
Get your stuff from a reputable source. Signing keys are nice, but they don't work as the sole security measure in an unsound supply chain.
The OTA firmware update keys ideally shouldn't be the same as the secure boot keys.
…how do the updates get booted then?
4 replies →
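A hedged sketch of how separate keys can coexist with secure boot: the update package carries an outer signature from the OTA key (checked by the update client on download) wrapping an inner image signed with the boot key (checked again by the boot ROM on every boot). Real systems use asymmetric signatures (RSA/ECDSA public-key hashes fused into the SoC); HMAC stands in here only to keep the sketch self-contained, and every key and name is hypothetical:

```python
import hashlib
import hmac

BOOT_KEY = b"key-whose-hash-is-burned-into-efuses"  # hypothetical
OTA_KEY = b"update-transport-key"                   # hypothetical

def sign(key, blob):
    return hmac.new(key, blob, hashlib.sha256).digest()

def verify(key, blob, sig):
    return hmac.compare_digest(sign(key, blob), sig)

def apply_update(package, ota_sig):
    """Update client: check the transport signature first, then make
    sure the inner image would also pass the boot ROM's own check,
    so the device never flashes something it cannot boot."""
    if not verify(OTA_KEY, package, ota_sig):
        raise ValueError("bad OTA signature")
    image, boot_sig = package[:-32], package[-32:]
    if not verify(BOOT_KEY, image, boot_sig):
        raise ValueError("image would fail secure boot")
    return image  # safe to flash
```

The image that actually boots is still boot-key-signed; the OTA key only protects delivery, so compromising or rotating it never yields a bootable malicious image.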
> Forcing the release of signing keys would be a security disaster. The first person to grab the expired auto-update domain of an EOL IoT device gets a free botnet.
Have you seen the state of embedded device security? It is already an unmitigated disaster.
Since you bring up botnets: far more security vulnerabilities are exploited because a vendor EOLed support (or went bankrupt) and shipped firmware with bugs that can never be fixed (signed firmware is required, and the source code was never provided) than because signing keys leaked and someone distributed malicious updates.
> Forcing vendors to release their security mechanisms to the public and allow anyone to sign firmware as the company is not what you want, though.
Yes, it is what I want. I am perfectly aware of the potential downsides, and what I am proposing is worth it. The product is already EOL. In our current era of enshittification, a vendor's pinky promise to implement a user bypass in their signed boot chain is not good enough. Look at the Other OS controversy on the PS3 if you want an example of this in practice, or at Samsung removing bootloader unlocking in its One UI 8.0 update.
> The only real way to make devices securely re-usable with custom firmware is to require some explicit step that signals the user wants to run 3rd-party firmware: a specific button press sequence is enough. The point is that the user must do something deliberate to acknowledge that 3rd-party software is being installed.
The vendor has implemented an internal pad on the laser-welded, weather-sealed, IP-rated smart watch that must be shorted to disable secure boot. Opening the device to access it will essentially destroy it, but we preserved the vendor's secure boot signing keys, so mission accomplished!
But you can still do both. Put a key into escrow that fully unlocks the device, but make the key usable only if the device is physically manipulated. This could mean holding down a button as it boots up to put it into “enter the unlock key” mode. The mode is useless until the key is published, and the key is useless without physical access to the device. And you don’t need to open anything: this could be a purely software thing. As long as you can communicate with the device externally via a button, Bluetooth, Ethernet, etc., you can build a system that allows this. Hell, you could trigger it with a magnet.
I agree that devices shouldn’t be locked by the manufacturer AND I think that silently unlocking all devices all at once could do harm.
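A sketch of that escrow-plus-physical-presence check, under the assumption that the factory burns only a hash of the escrow key into fuses, so the vendor never ships the key itself and leaking the hash reveals nothing. All names are illustrative:

```python
import hashlib

ESCROW_KEY = b"published-by-escrow-agent-after-eol"     # hypothetical
FUSED_UNLOCK_HASH = hashlib.sha256(ESCROW_KEY).digest() # burned at factory

def try_unlock(candidate_key, physical_presence):
    """Disable secure-boot enforcement only when BOTH hold: the user
    proved physical presence (button/magnet during boot) and the
    presented key hashes to the fused value. A remote attacker who
    learns the published key still never reaches the second check."""
    if not physical_presence:
        return False
    return hashlib.sha256(candidate_key).digest() == FUSED_UNLOCK_HASH
```

Storing the hash rather than the key means nothing sensitive sits on the device, and publication of the escrow key at EOL is what finally makes the already-present mode useful.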
> Have you seen the state of embedded device security? It is already an unmitigated disaster.
If security were an unmitigated disaster on every device, then it would be trivial to root them all and install your own software, wouldn’t it?
I think that's a fair distinction, and it highlights that "just publish the protocol" isn't sufficient for every class of device.
Locked bootloaders should just be completely forbidden, even on brand-new devices. Hardware and phone owners have the right to make any change they see fit to their device, whether or not the manufacturer thinks it's OK.
I agree with you fully on this. Unfortunately, the odds are stacked very unfavorably against us. It's not just the manufacturers who resort to these underhanded profiteering tactics. Even the regulatory agencies are for locking down the firmware.
Their argument is that unlocked firmware would let us override regulatory restrictions like the RF output power or the IMEI number. That argument has some merit. My opinion, however, is that such restrictions should be implemented as hardware interlocks that software cannot change; then we would be free to modify the software as we like. Sadly, both the manufacturers and the regulatory agencies tend to ignore that solution entirely, so that they can retain their excess control.
It's trivially easy to break those restrictions with off-the-shelf SDR hardware you can buy rather cheaply.
Locking people out of their phone does not raise the skill or effort ceiling much, as there still presumably would be software restrictions in place.
I always found this claim completely bogus: you can always do something illegal with your phone; there's no way to prevent everything with software.
Preventing illegal use is the job of law enforcement and the justice system in general; in this argument, a hardware manufacturer is substituting itself into that role. Framed that way, the overreach is clear: manufacturers aren't public entities empowered to make such decisions.
There are security reasons to use locked bootloaders.
But I do agree that we should be able to unlock and relock the bootloader. That's one of the reasons GrapheneOS supports the Google Pixel, for instance. The security model relies on the locked bootloader.
Very few people need a GrapheneOS level of security anyway.
Yeah, sure, there are a few cases where it makes sense, but they are few and far between.
4 replies →
How about just allowing key enrollment with a physical button?
This is very much not an option on most embedded devices. They allow one key to be burned once.
IIRC, a certain Marvell SoC datasheet says multiple key slots are supported, but the boot ROM only reads the first entry (so in practice, only one key is supported).
Unless it becomes a law, and the hardware makers adapt.
1 reply →