Comment by piskov

5 hours ago

So that’s how, in the event of a war, US adversaries will be relieved of their devices.

> The anti-rollback mechanism uses Qfprom (Qualcomm Fuse Programmable Read-Only Memory), a region on Qualcomm processors containing one-time programmable electronic fuses.

What nice, thoughtful people, to build such a feature.

That’s why you sanction the hell out of China’s Loongson or Russia’s Baikal, pitiful CPUs though they are: they’re harder to disable than programmatically “blowing a fuse”.

This kind of thing is generally used to disallow downgrading the bootloader once a bug is found in the bootloader's chain-of-trust handling. Otherwise, once broken, it is forever broken. It makes sense from the trusted-computing perspective to have this. It's not even new; it was already there on P2K Motorolas 25 years ago.

As a consumer you may not want trusted computing and may prefer to root/jailbreak everything, but building it is not inherently evil.
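For concreteness, here is a minimal sketch of how fuse-based anti-rollback usually plugs into a boot stage. The qfprom_* helpers are hypothetical stand-ins for whatever the SoC's boot ROM or vendor library actually exposes; real implementations typically raise the fuse floor only after the new image has been verified and confirmed to boot.

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical helpers: on a real SoC these would be boot-ROM or
 * vendor-library calls that read and burn one-time-programmable fuses. */
uint32_t qfprom_read_min_version(void);        /* version floor recorded in fuses */
void     qfprom_burn_min_version(uint32_t v);  /* raise the floor (irreversible)  */

/* Refuse to boot any image older than the fused minimum version. */
bool anti_rollback_check(uint32_t image_version)
{
    uint32_t min_version = qfprom_read_min_version();

    if (image_version < min_version)
        return false;  /* downgrade attempt: an old, possibly exploitable image */

    if (image_version > min_version)
        qfprom_burn_min_version(image_version);  /* can never be lowered again */

    return true;
}
```

The irreversibility is the entire point, and also the entire objection upthread: a blown fuse cannot be un-blown.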

  • Trusted computing means trusted by the vendor and content providers, not trusted by the user. In that sense I consider it very evil.

    • If the user doesn't trust an operating system, why would they use it? The operating system can steal sensitive information. Trusted computing is trusted by the user to the extent that they use the device. For example, if they don't trust it, they may avoid logging in to their bank on it.

      3 replies →

  • A discussion you don't see nearly enough of is the fundamental tradeoff with hardware security features: every feature you can use to secure your device can also be used by an adversary to keep control once they compromise you.

    • Not only can, but inevitably is. Security folks - especially in mobile - are commonly useful idiots for introducing measures which are practically immediately co-opted to take away users' ability to control their device and modify it to serve them better. Every single time.

      We just had the Google side loading article here.

  • I’d like to think I’m buying the device, not a seat to use the device, at least if I do not want to use their software.

    • You can't have that with phones. You are always at the mercy of the hardware supplier and their trusted boot chain, which starts with the actual phone processor (the one running GSM stuff, not user-interface stuff). That one is always locked down, and it decides whether to boot your fancy Android stuff.

      The fact that it's locked down and remotely killable is a feature that people pay for and regulators enforce from their side too.

      At the very best, the supplier plays nice and allows you to run your own applications, remove whatever crap they preinstalled, and change the font face. If you are really lucky, you can choose to run a practically useless Linux distribution instead of the practically useful Linux distribution, with their blessing. That blessing is a transient thing that can be revoked at any time.

      6 replies →

OTP memory is a key building block of any secure system and is likely in any device you already own.

Any kind of device-unique key is likely rooted in OTP (via a seed or PUF activation).

The root of all certificate chains is likely hashed in fuses to prevent swapping out cert chains with a flash programmer.
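A hedged sketch of what "hashed in fuses" typically looks like at boot; the otp_read_root_key_hash and sha256 helpers are invented names standing in for ROM-provided services:

```c
#include <stdint.h>
#include <string.h>
#include <stdbool.h>

#define DIGEST_LEN 32  /* SHA-256 digest size */

/* Hypothetical platform hooks, standing in for boot-ROM services. */
void otp_read_root_key_hash(uint8_t out[DIGEST_LEN]);               /* digest burned at manufacture */
void sha256(const void *data, size_t len, uint8_t out[DIGEST_LEN]); /* ROM-provided hash routine    */

/* Check the root public key shipped alongside the firmware in external flash. */
bool root_key_is_trusted(const uint8_t *root_pubkey, size_t key_len)
{
    uint8_t fused[DIGEST_LEN], computed[DIGEST_LEN];

    otp_read_root_key_hash(fused);
    sha256(root_pubkey, key_len, computed);

    /* A flash programmer can swap the key and the cert chain, but it cannot
     * change the fused digest, so a swapped chain fails right here. */
    return memcmp(fused, computed, DIGEST_LEN) == 0;
}
```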

It's commonly used for anti-rollback as well; the biggest news here is that they didn't have this already.

Without it, if some horrible security bug is found in an old version of their software, they have no way to stop an attacker from loading up the broken firmware to exploit your device. That is not aligned with modern best practices for security.

  • > they have no way to stop an attacker from loading up the broken firmware to exploit your device

    You mean the attacker with physical access to the device, plugging in some USB or UART? Or the attacker who already had enough control to downgrade the firmware, so they could use the exploit in the older version... in order to downgrade the firmware to the version with the exploit?

    • Sure. Or the supply-chain attacker (who is perhaps a state-level actor, if you want to think really spicy thoughts) selling you a device on Amazon that you think is secure but that they messed with when it passed through their hands on its way to you.

    • > You mean the attacker having a physical access to the device plugging in some USB or UART

      ... which describes US border controls, or police in general. Once "law enforcement" becomes part of one's threat model, the balance of a lot of trade-offs suddenly changes entirely.

eFuses have been a thing forever on almost all MCUs/processors, and they aren't some inherently "evil" technology - mostly they're used in manufacturing when you might have the same microcontroller/firmware on separate types of boards. I'm working on a board right now which is either an audio input or an output (depending on which components are fitted), and one or the other eFuse is burned to record which it is, so subsequent firmware releases won't accidentally set a GPIO as an output rather than an input and potentially damage the device.
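As a toy version of that pattern (all names invented here, not the actual project's code): read the variant fuse once at boot and let it gate the pin direction, so one firmware image safely serves both assemblies.

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical MCU/HAL hooks for the example. */
uint32_t efuse_read(uint32_t bank);                 /* returns the burned fuse word */
void     gpio_set_direction(int pin, bool output);  /* true = drive, false = listen */

#define VARIANT_FUSE_BANK  3
#define VARIANT_IS_OUTPUT  (1u << 0)  /* burned in the factory on the audio-output build */
#define AUDIO_IO_PIN       17

/* Called once at boot: the same firmware configures itself for whichever
 * board variant the factory burned into the fuse. */
void configure_audio_io(void)
{
    bool is_output = (efuse_read(VARIANT_FUSE_BANK) & VARIANT_IS_OUTPUT) != 0;

    /* On the input variant the pin must never be driven, since two fighting
     * outputs could damage the board - the failure mode the fuse prevents. */
    gpio_set_direction(AUDIO_IO_PIN, is_output);
}
```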

  • Isn't this normally done with a GPIO bootstrap?

    • It depends. Usually there are enough "knobs" that adding a ball to the package for each of them would be crazy expensive at volume.

      Most SoCs of even moderate complexity have lots of redundancy built in for yield management (e.g. anything with RAM expects some percentage of the RAM cells to be dead on any given chip) and use fuses to keep track of that. If you had to have a strap per RAM block, it would not scale.
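      A rough sketch of that idea, with invented names: the fuses hold a per-block "dead" bitmap burned at wafer test, and boot code skips those blocks when it builds the usable memory map.

      ```c
      #include <stddef.h>
      #include <stdint.h>

      #define RAM_BLOCKS  64
      #define BLOCK_SIZE  (64u * 1024u)

      /* Hypothetical: one fuse bit per RAM block, set at test time if the block failed. */
      uint64_t fuse_read_bad_block_map(void);

      /* Collect the base offsets of the good blocks, skipping fused-out ones. */
      size_t build_ram_map(uint32_t usable_base[RAM_BLOCKS])
      {
          uint64_t bad = fuse_read_bad_block_map();
          size_t n = 0;

          for (int i = 0; i < RAM_BLOCKS; i++)
              if (!(bad & (1ull << i)))
                  usable_base[n++] = (uint32_t)i * BLOCK_SIZE;

          return n;  /* number of usable blocks on this particular chip */
      }
      ```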

There are so many ways to do this, but a simpler method is to hide a small logic block (somewhere among the 10 billion transistors of your CPU) that detects a specific, long sequence of bits and invokes the kill switch.
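In C terms (purely a software model of the hypothetical hardware block described above, not any known implementation), such a detector is just a shift register compared against a long magic value:

```c
#include <stdint.h>
#include <stdbool.h>

/* The trigger value is made up for the example; in hardware this would be a
 * hidden comparator watching some observable bit stream. */
#define KILL_PATTERN 0xDEADBEEFCAFEF00Dull

static uint64_t shift_reg;

/* Feed one observed bit at a time; returns true when the magic sequence
 * has just been seen, at which point the "kill switch" would fire. */
bool kill_switch_triggered(int bit_in)
{
    shift_reg = (shift_reg << 1) | (uint64_t)(bit_in & 1);
    return shift_reg == KILL_PATTERN;
}
```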

>That’s why you sanction the hell out of Chinese Loongson or Russian Baikal

I assume that's also why China is investing so heavily in open-source RISC-V.

This has been going on for a long, long time. Motorola used to make Android phones that would burn an eFuse in the SoC if they thought they were being rooted or jailbroken, bricking the phone.