Comment by mjevans

1 month ago

Incorrect.

Choice 2. Empowered user. The end user is free to CHOOSE to delegate the hardware's signing trust to a third party. Possibly even a third party that is already included in the base firmware, such as Microsoft, Apple, the OEM, or 'Open Source' (sub-menu: a list of several reputable distros, plus a choice which might show a big scary message and an involved confirmation process to trust the inserted boot media or the URL the user typed in...)

There should also be a reset option, which might involve a jumper or physical key (e.g. clear CMOS) that factory resets any TPM / persistent storage. Yes, it'd nuke everything in the enclave, but it would release the hardware.

I like the way Chromebooks do things, initially locking down the hardware but allowing you to do whatever if you intentionally know what you're doing (after wiping the device for security reasons). It's a pity that there's all the Google tracking in them that's near impossible to delete (unless you remove Chrome OS).

  • > I like the way Chromebooks do things, initially locking down the hardware but allowing you to do whatever if you intentionally know what you're doing

    Did you hear? Google's no longer allowing "sideloading" (a whitewashed term for plain installing) of third-party apps by unknown developers.

    > after wiping the device for security reasons

    Think of the ~~children~~ data!

  • I wonder if full device wipe would be the solution to "annoying enough that regular users don't do it even when asked by a scam, but power users can and will definitely use it".

    • That's how bootloader unlocking has worked on Android phones for ages, and I've never heard of it being abused, so I think it's a good model.

      2 replies →

Consider the possibility of an evil maid type attack before a device is setup for the first time, e.g. running near identical iOS or macOS but with spyware preloaded, or even just adware.

  • We already have that today. And locked down systems don't prevent it, because you can always exploit some part of the supply chain. A determined actor will always find a path.

    • Right now you'd need a zero-day bootrom exploit to do something like this - still a possibility for a high-level intelligence operative, but not for the average white-collar citizen. The proposal makes such a thing a feature.

      1 reply →

  • It's possible to make this detectable, and Chromebooks already do.

    On a Chromebook, if you toggle developer mode you get a nag screen at early boot every time, telling you it's in developer mode; if you're not in developer mode you can only boot signed code.

    Basically, just bake into the device's firmware that if any non-Apple keys have been added, it forcibly displays 'bootloader not signed by Apple, signed by X', and if someone sees that on a "new" device, they'll know to run.
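A minimal Python sketch of that firmware check (the key names and banner text are hypothetical, not Apple's actual verified-boot behavior):

```python
# Hypothetical vendor key baseline baked into ROM at manufacture time.
VENDOR_KEYS = {"apple-root-2024"}

def boot_banner(enrolled_keys: set[str]) -> str:
    """Return the screen the firmware shows before handing off to the OS."""
    extra = enrolled_keys - VENDOR_KEYS
    if extra:
        # Rendered by firmware on every boot, so the OS can't suppress it.
        return ("WARNING: bootloader not signed by Apple; "
                "trusted keys: " + ", ".join(sorted(extra)))
    return "Verified boot"

print(boot_banner({"apple-root-2024"}))              # normal device
print(boot_banner({"apple-root-2024", "user-key"}))  # user-modified device
```

The point is that the warning is derived from the key database itself, so a "new" device that has had a key enrolled cannot present a clean boot screen.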

  • With the root of trust and original software wiped, what used to be, say, an iPhone stops being an iPhone. It becomes a generic computer with the same hardware. All the software designed to run on iPhones like the App Store is likely to stop working. You won't fool the user for long.

    And this attack is already doable by simply replacing the iPhone with a fake. It won't fool the user for long either, but you get to steal a real iPhone in exchange for a cheap fake.

  • This can be fixed by adding some user-controlled "fuse". For example, with a TPM you will lose access to stored keys if the boot sequence is modified.
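The TPM mechanism being described is measured boot: each boot stage extends a hash chain (a PCR), and keys sealed to the expected final value become unrecoverable if any stage changes. A minimal Python sketch of the extend operation (stage names are made up; real TPMs do this in hardware across many PCRs):

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    # TPM-style extend: new PCR = H(old PCR || H(measurement))
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

def boot_pcr(stages):
    pcr = b"\x00" * 32  # PCRs start zeroed at power-on
    for stage in stages:
        pcr = extend(pcr, stage)
    return pcr

trusted = [b"bootrom v1", b"bootloader v5", b"kernel 6.1"]
sealed_to = boot_pcr(trusted)  # keys are released only if the PCR matches this

tampered = [b"bootrom v1", b"evil bootloader", b"kernel 6.1"]
assert boot_pcr(tampered) != sealed_to  # any modified stage breaks the chain
```

Because extend is one-way and order-sensitive, malicious firmware cannot reproduce the trusted PCR value, so the sealed keys act as the "fuse".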

  • You can have a TPM with your own hardware key, which allows you to verify the integrity of the BIOS. This works fine on my Librem laptop with a Librem Key.

Incorrect. For us tech people this is an option. My older family members will definitely install malware and send all their data to China.

Please don’t let me go back to the early days of the internet where my mother had 50 toolbars and malware installed

  • > Please don’t let me go back to the early days of the internet where my mother had 50 toolbars and malware installed

    I removed hundreds of toolbars from my mother's/grandmother's/anyone's computer.

    I still prefer that to techno-fascism where it's OK for companies to brick my hardware remotely, to lock me out of all my hardware because I have a picture of my kid in a bath, to read all my messages for whatever reason, to extract value from my personal files, pictures, and musical tastes, to not allow me to install an app I bought because it has been removed from the store, to not allow me to install an app my friend created, to not allow me to create an app and sell it myself, to never let me decline an action outright instead of just "Later this week", and so on and so on.

    This toolbar thing is a poor excuse. And it was 90% because Windows was shitty.

    Most mothers would have easily downloaded and installed crapware bundled with whatever they downloaded, but most mothers aren't going to go to Settings > About > Tap 10 times on OS version > Bootloader > Disable bootloader protection > "Are you sure? Your phone will become insecure" > Yes > Fucking yes.

    And if they still do it to purposefully install malware, I'm sorry to say they are just stupid and I couldn't care less about the toolbars.

    • Yes. So both options should be allowed to exist. One of them shouldn’t be banned because you don’t like it.

This.

We need a mobile bill of rights for this stuff.

- The devices all of society has standardized upon should not be owned by companies after purchase.

- The devices all of society has standardized upon should not have transactions be taxed by the companies that make them, nor have their activities monitored by the companies that make them. (Gaming consoles are very different than devices we use to do banking and read menus at restaurants.)

- The devices all of society has standardized upon should not enforce rules for downstream software apart from heuristic scanning for viruses/abuse and strong security/permissions sandboxing that the user themselves controls.

- The devices all of society has standardized upon should be strictly regulated by governments all around the world to ensure citizens and businesses cannot be strong-armed.

- The devices all of society has standardized upon should be a burden for the limited few companies that gatekeep them.

Keep in mind one of these third parties would almost certainly be Meta (because users want their stuff), and that would almost certainly be a privacy downgrade.

  • Freedom > Privacy > Security

    Never give up your freedom.

    If you have to give up your privacy to ensure your freedom, so be it.

    If you have to give up your security to ensure your privacy, so be it.

    This goes for governments and phones.

    • Always fun to interact with some internet Thomas Jefferson giving freedom speeches from his mother's basement.

      Reality is that people pay a lot of money because they 'trust' Apple (and to a lesser extent Google), but Meta is the sleaziest one of them all. (And I don't use their shit either.) But people want WhatsApp and Instagram, and so you are telling them they now have to sell out and go to the "Meta App Store" to talk to their friends. That fucking sucks. And I think you agree with that.

      7 replies →

    • > This goes for governments and phones.

      Apple does not have the ability to throw me in prison or take away my freedoms. Only to not grant me extra freedoms subsidized by their R&D budget.

      7 replies →

>big scary message

Open question:

Any ideas on making it so difficult that grandma can't even follow a phisher's instructions over the phone, yet nearly trivial for anyone who knows what they're doing?

  • Sure. You ship the device in open mode, and then doing it is easy. The device supports closed mode (i.e. you can no longer add to the currently configured package installation sources), and if you put the device in closed mode, getting it back out requires attaching a debugger to the USB port, a big scary message and confirmation on the phone screen itself, and a full device wipe.

    Then you put grandma's device in closed mode and explicitly tell her never to do the scary thing that takes it back out again and call you immediately if anyone asks her to. Or, for someone who is not competent to follow that simple instruction (e.g. small children or senile adults), you make the factory reset require a password and then don't give it to them.

    • Very nice!

      I’m sure I’m missing a problem with the following approach: shipping in _closed_ mode with a sticker on the front notifying the person they should do a factory reset immediately to make sure they can do everything they want to do. During the reset, include a scary message for those who opt in to get to open mode.

      Everyone simply goes by defaults so it would only be technical people presumably who would even get into the open mode in the first place. And then require the debugger to leave closed mode like you said.

      Edit: this comment worries about solo/asocial/“orphaned” members of our society

      1 reply →

    • Make it an obscure option in the first time setup so all the users that click next next next will end up with the secure mode, while the open mode requires fiddling.

      This isn’t a gdpr opt out where both alternatives need to be equally easy. We (as a society) absolutely need the devices to default to the current model when purchased.

      3 replies →

  • Fix the phone system so calls must positively identify themselves.

    There is no reason anyone purporting to be from a business or the government should be able to place a call without cryptographically proving their identity.
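A minimal Python sketch of signed caller identity, using an HMAC shared secret as a stand-in for the certificate-based signatures that real systems like STIR/SHAKEN use (all names and numbers here are made up):

```python
import hashlib
import hmac
import json
import time

# Stand-in for a carrier's signing credential; real deployments use
# per-carrier certificates and public-key signatures, not a shared secret.
CARRIER_KEY = b"demo-shared-secret"

def attest_call(caller: str, callee: str) -> dict:
    """Carrier signs who is calling whom, and when."""
    claims = {"orig": caller, "dest": callee, "iat": int(time.time())}
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(CARRIER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "sig": sig}

def verify_call(token: dict) -> bool:
    """Callee's network checks the signature before ringing the phone."""
    payload = json.dumps(token["claims"], sort_keys=True).encode()
    expected = hmac.new(CARRIER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"])

token = attest_call("+15551234567", "+15559876543")
assert verify_call(token)
token["claims"]["orig"] = "+15550000000"  # spoofed caller ID
assert not verify_call(token)
```

With this in place, a spoofed number simply fails verification, which is what would let grandma's phone flag or drop unattested calls.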

    • I like that! I’m sure it would take a little bit of time for folks to stop trusting calls from personal numbers where highly-capable social engineers do their best work, but eventually I expect nearly all of us would learn the lesson.

      And presumably we could set up notifications so our elderly relatives’ phones would alert us to calls from unverified numbers not in their contact list lasting longer than a minute or two.

  • Stop gatekeeping actually useful apps. Nobody should ever need to see the message to do anything they actually want to do; otherwise it leads to normalization of deviance.

    False positives from PC virus scanners are very rare.

    • Interesting, mind elaborating a bit/clarifying the first couple of sentences there? A point I’d like to understand

    • What are you on about? In the last 10 years of computing, the only times Windows Defender pinged for me were false positives.