
Comment by 201984

5 hours ago

And what if that customer wants to run their own firmware, e.g. after the manufacturer goes out of business? "Security" in this case conveniently prevents that.

Tradeoffs. Which is more likely here?

1. A customer wants to run their own firmware, or

2. Someone malicious close to the customer (an angry ex, say) tampers with their device, and uses the lack of Secure Boot to modify the OS to hide all trace of a tracker's existence, or

3. A malicious piece of firmware uses the lack of Secure Boot to modify the boot partition so the malware loads before the OS, permanently preventing the system from repairing itself from within

Apple uses #2 and #3 in their own arguments. If your Mac gets hacked, that's bad. If your iPhone gets hacked, that's your life, and your precise location, at all times.

  • 1. P(someone wants to run their own firmware)

    2. P(someone wants to run their own firmware) * P(this person is malicious) * P(this person implants this firmware on someone else’s computer)

    3. The firmware doesn’t install itself

    Yeah, I think 2 and 3 are vastly less likely, and strictly lower, than 1.
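
    The product argument above can be sketched with numbers (all probabilities here are made up purely for illustration; only the inequality matters):

    ```python
    # Hypothetical, made-up probabilities to illustrate the argument.
    # Scenario 2 multiplies the same base event as scenario 1 by further
    # conditional probabilities, each <= 1, so it can never exceed scenario 1.

    p_wants_own_firmware = 0.01        # P(someone wants to run their own firmware)
    p_malicious_given_wants = 0.05     # P(that person is malicious)
    p_implants_on_victim = 0.5         # P(they implant it on someone else's device)

    p_scenario_1 = p_wants_own_firmware
    p_scenario_2 = (p_wants_own_firmware
                    * p_malicious_given_wants
                    * p_implants_on_victim)

    # Holds for any probabilities in [0, 1], regardless of the values chosen.
    assert p_scenario_2 <= p_scenario_1
    print(p_scenario_1, p_scenario_2)
    ```

    (As replies below note, the weak link in this model is the first factor: the attacker in scenario 2 need not be drawn from the same population as the customer in scenario 1.)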

    • As an embedded programmer in my former life, the number of customers that had the capability of running their own firmware, let alone the number that actually would, rapidly approaches zero. Like it or not, what customers bought was an appliance, not a general purpose computer.

      (Even if, in some cases, it was just a custom-built SBC running BusyBox, customers still aren't going to go digging through a custom network stack.)

    • I encourage you to re-evaluate this. How many devices do you own (or have you owned) which have a microcontroller? (This includes all your appliances, your clocks, and many other things you own which use electricity.) How many of those have you reflashed with custom firmware?

      Imagine any of your friends, family, or colleagues (including the non-programmers/hackers/embedded-engineers). What would their answers be?

    • As if the monetary gain of 2 and 3 never entered the picture. Malicious actors want 2 and 3 to make money off you! No one can make reasonable amounts of money off 1.

    • On Android, according to the Coalition Against Stalkerware, over 1 million people every year fall victim to spyware deliberately installed on an unlocked device by a malicious person close to them.

      #2 is WAY more likely than #1. And that's on Android which still has some protections even with a sideloaded APK (deeply nested, but still detectable if you look at the right settings panels).

      As for #3: the point is that it's a virus. You start with a WebKit bug and get into the kernel from there (it sometimes happens); but this time, instead of a software update fixing it, your device is owned forever. It literally cannot be trusted again without a full DFU wipe.


  • #2 and #3 are fearmongering arguments and total horseshit, excuse the strong language.

    Should either of those things happen, the bootloader puts up a big, bright, flashing yellow warning screen saying "Someone hacked your device!"

    I use a Pixel device and run GrapheneOS, the bootloader always pauses for ~5 seconds to warn me that the OS is not official.

    • Yes. They're making the point that your flashing yellow warning is a good thing, and that it's helpful to the customer that a mechanism is in place to prevent it from being disabled by an attacker.


Then that customer shouldn't buy a device that doesn't allow for their use case. Exercise some personal agency. Sheesh.

  • What happens when there are no more devices that allow for that use case? This is already pretty much the case for phones, it's only a matter of time until Microsoft catches up.