Comment by tzury
1 month ago
We need both options to coexist:
1. Open, hackable hardware for those who want full control and for driving innovation
2. Locked-down, managed devices for vulnerable users who benefit from protection
This concept of "I should run any code on hardware I own" is completely wrong as a universal principle. Yes, we absolutely should be able to run any code we want on open hardware we own - that option must exist. But we should not expect manufacturers of phones and tablets to allow anyone to run any code on every device, since this will cause harm to many users.
There should be more open and hackable products available in the market. The DIY mindset at the junction of hardware and software is crucial for tech innovation - we wouldn't be where we are today without it. However, I also want regulations and restrictions on the phones I buy for my kids and grandparents. They need protection from themselves and from bad actors.
The market should serve both groups: those who want to tinker and innovate, and those who need a safe, managed experience. The problem isn't that locked-down devices exist - it's that we don't have enough truly open alternatives for those who want them.
Incorrect.
Choice 2. Empowered user. The end user is free to CHOOSE to delegate the hardware's approved signing solutions to a third party. Possibly even a third party already included in the base firmware, such as Microsoft, Apple, the OEM, or 'Open Source' (sub-menu: a list of several reputable distros, plus a choice which might have a big scary message and an involved confirmation process to trust the inserted boot media or the URL the user typed in...)
There should also be a reset option, which might involve a jumper or physical key (e.g. clear CMOS) that factory resets any TPM / persistent storage. Yes, it'd nuke everything in the enclave, but it would release the hardware.
I like the way Chromebooks do things, initially locking down the hardware but allowing you to do whatever if you intentionally know what you're doing (after wiping the device for security reasons). It's a pity that there's all the Google tracking in them that's near impossible to delete (unless you remove Chrome OS).
> I like the way Chromebooks do things, initially locking down the hardware but allowing you to do whatever if you intentionally know what you're doing
Did you hear? Google's not allowing "sideloading" (whitewashing the meaning of installing) third party apps by unknown developers.
> after wiping the device for security reasons
Think of the ~~children~~ data!
I wonder if full device wipe would be the solution to "annoying enough that regular users don't do it even when asked by a scam, but power users can and will definitely use it".
Consider the possibility of an evil maid type attack before a device is setup for the first time, e.g. running near identical iOS or macOS but with spyware preloaded, or even just adware.
We already have that today. And locked down systems don't prevent it, because you can always exploit some part of the supply chain. A determined actor will always find a path.
It's possible to make this detectable, and chromebooks already do.
On a chromebook, if you toggle to developer mode you get a nag screen on early-boot telling you it's in developer mode every time, and if you're not in developer mode you can only boot signed code.
Basically, just bake into the device's firmware that "if any non-Apple keys have been added, forcibly display 'bootloader not signed by Apple, signed by X'", and if someone sees that on a "new" device, they'll know to run.
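The check described above could be sketched like this (hypothetical names and a generic "OEM" key; real verified boot, such as ChromeOS's vboot or Android Verified Boot, is far more involved):

```python
# Hypothetical enrolled-key ID baked into firmware at manufacture.
OEM_KEY_ID = "oem-root-2025"

def boot_banner(enrolled_keys, bootloader_signer):
    """Return the warning firmware should display before booting, or None.

    Any signing key beyond the OEM's triggers a persistent, unskippable
    notice, so a reflashed "new" device gives itself away on first boot.
    """
    extra = [k for k in enrolled_keys if k != OEM_KEY_ID]
    if not extra:
        return None  # pristine chain of trust: boot silently
    return f"Bootloader not signed by OEM; signed by {bootloader_signer}"

# A pristine device boots silently; a tampered one warns on every boot.
assert boot_banner([OEM_KEY_ID], OEM_KEY_ID) is None
assert "evil-maid" in boot_banner([OEM_KEY_ID, "evil-maid"], "evil-maid")
```

The point is that the warning logic lives below anything an attacker can flash, which is exactly what the Chromebook developer-mode nag screen does.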
With the root of trust and original software wiped, what used to be, say, an iPhone stops being an iPhone. It becomes a generic computer with the same hardware. All the software designed to run on iPhones like the App Store is likely to stop working. You won't fool the user for long.
And this attack is already doable by simply replacing the iPhone with a fake. It won't fool the user for long either, but you get to steal a real iPhone in exchange for a cheap fake.
This can be fixed by adding some user-controlled "fuse". For example, with a TPM you will lose access to stored keys if the boot sequence is modified.
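A simplified model of that TPM behavior (toy hash chain, not the real TPM 2.0 API): each boot stage is measured into a PCR, and a key sealed against the trusted PCR value simply never comes back out if any stage was swapped.

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    # TPM PCRs can only be extended, never set: new = H(old || H(data))
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

def measure_boot(stages):
    pcr = b"\x00" * 32
    for stage in stages:
        pcr = extend(pcr, stage)
    return pcr

def unseal(current_pcr, sealed_pcr, key):
    # The TPM releases the key only if boot measurements match exactly.
    return key if current_pcr == sealed_pcr else None

# Seal a disk key against the PCR value of the trusted boot chain.
trusted_chain = [b"bootrom", b"bootloader-v1", b"kernel-v1"]
sealed_pcr = measure_boot(trusted_chain)
key = b"disk-encryption-key"

assert unseal(measure_boot(trusted_chain), sealed_pcr, key) == key
# An evil-maid swap of the bootloader changes the PCR: the key stays locked.
tampered = [b"bootrom", b"spyware-loader", b"kernel-v1"]
assert unseal(measure_boot(tampered), sealed_pcr, key) is None
```

So the "fuse" is automatic: tampering doesn't need to be forbidden, it just destroys access to everything the previous owner sealed.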
You can have a TPM with your own hardware key, which allows you to verify the integrity of the BIOS. Works fine on my Librem laptop with a Librem Key.
Incorrect. For us as tech people this is an option. My older family members will definitely install malware and send all their data to China.
Please don’t let me go back to the early days of the internet where my mother had 50 toolbars and malware installed
> Please don’t let me go back to the early days of the internet where my mother had 50 toolbars and malware installed
I removed hundreds of toolbars from my mother/grandmother/anyone computer.
I still prefer that to techno-fascism where it's OK for companies to brick my hardware remotely, to lock me out of all my hardware because I have a picture of my kid in a bath, to read all my messages for whatever reason, to extract value from my personal files, pictures, musical tastes, to not allow me to install an app I bought because it has been removed from the store, to not allow me to install an app my friend created, to not allow me to create an app and sell it myself, to not let me refuse an action outright but only postpone it to "Later this week", and so on and so on.
This toolbar thing is a poor excuse. And it was 90% because Windows was shitty.
Most mothers would have easily downloaded and installed crapware embedded with whatever they downloaded, but most mothers aren't going to go through "Settings > About > Tap 10 times on OS version > Bootloader > Disable bootloader protection > 'Are you sure? Your phone will become insecure' > Yes > Fucking yes".
And if they still do it to purposefully install malware, I'm sorry to say they are just stupid and I couldn't care less about the toolbars.
This.
We need a mobile bill of rights for this stuff.
- The devices all of society has standardized upon should not be owned by companies after purchase.
- The devices all of society has standardized upon should not have transactions be taxed by the companies that make them, nor have their activities monitored by the companies that make them. (Gaming consoles are very different than devices we use to do banking and read menus at restaurants.)
- The devices all of society has standardized upon should not enforce rules for downstream software apart from heuristic scanning for viruses/abuse and strong security/permissions sandboxing that the user themselves controls.
- The devices all of society has standardized upon should be strictly regulated by governments all around the world to ensure citizens and businesses cannot be strong-armed.
- The devices all of society has standardized upon should be a burden for the limited few companies that gatekeep them.
Keep in mind one of these third parties would almost certainly be Meta (because users want their stuff), and that would almost certainly be a privacy downgrade.
Freedom > Privacy > Security
Never give up your freedom.
If you have to give up your privacy to ensure your freedom, so be it.
If you have to give up your security to ensure your privacy, so be it.
This goes for governments and phones.
>big scary message
Open question:
Any idea on making it so difficult that grandma isn't even able to follow a phisher’s instructions over the phone but yet nearly trivial for anyone who knows what they’re doing?
Sure. You ship the device in open mode, and then doing it is easy. The device supports closed mode (i.e. whatever the currently configured package installation sources are, you can no longer add more), and if you put the device in closed mode, getting it back out requires attaching a debugger to the USB port, a big scary message and confirmation on the phone screen itself, and a full device wipe.
Then you put grandma's device in closed mode and explicitly tell her never to do the scary thing that takes it back out again and call you immediately if anyone asks her to. Or, for someone who is not competent to follow that simple instruction (e.g. small children or senile adults), you make the factory reset require a password and then don't give it to them.
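The scheme above amounts to a small state machine (a hypothetical sketch; nothing ships quite like this today):

```python
class Device:
    """Toy model of the ship-open / lock-closed scheme described above."""

    def __init__(self):
        self.mode = "open"                  # ships open: adding sources is easy
        self.data = {"photos": ["cat.jpg"]}

    def lock(self):
        # Going closed is a one-tap action, e.g. when setting up grandma's phone.
        self.mode = "closed"

    def unlock(self, debugger_attached: bool, on_screen_confirmed: bool) -> str:
        # Exiting closed mode demands physical access (USB debugger), an
        # on-screen confirmation, and a mandatory full wipe -- far too much
        # friction for a phone scammer to talk a victim through.
        if self.mode == "closed" and debugger_attached and on_screen_confirmed:
            self.data.clear()               # full device wipe
            self.mode = "open"
        return self.mode
```

A scammer on the phone can't get past the physical-debugger requirement, while a power user with a cable can; and because the exit path always wipes the device, there's nothing for the scammer to gain even if they succeed.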
Fix the phone system so calls must positively identify themselves.
There is no reason anyone purporting to be from a business or the government should be able to place a call without cryptographically proving their identity.
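This is in spirit what STIR/SHAKEN attempts for caller ID on SIP calls. A toy HMAC version (assumed shared key for illustration; real deployments use certificate chains, not shared secrets):

```python
import hashlib
import hmac

# Assumed: the originating carrier holds this key and has verified
# the caller's identity before attesting it.
CARRIER_KEY = b"demo-only-shared-secret"

def attest(caller_id: str) -> str:
    """Carrier signs the caller ID it verified at call origination."""
    return hmac.new(CARRIER_KEY, caller_id.encode(), hashlib.sha256).hexdigest()

def verify(caller_id: str, tag: str) -> bool:
    """Terminating carrier rejects calls whose identity claim doesn't verify."""
    return hmac.compare_digest(attest(caller_id), tag)

tag = attest("IRS +1-800-829-1040")
assert verify("IRS +1-800-829-1040", tag)
assert not verify("Your Bank", tag)   # spoofed identity fails verification
```

The hard part isn't the cryptography, it's making every carrier in the call path participate, which is why spoofed calls still get through today.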
Stop gatekeeping actually useful apps. Nobody should ever need to see the message to do anything they actually want to do; otherwise it leads to normalization of deviance.
False positives from PC virus scanners are very rare.
I'd argue that even the 'safe' devices should at least be open enough to delegate trust to someone besides the original manufacturer. Otherwise it just becomes ewaste once the manufacturer stops support. (Too often they ship vulnerable and outdated software then never fix it.)
If the user cannot be trusted to maintain the hardware and software, then the only responsible thing is to rely on the manufacturer to do so. In those cases, if the support is dropped you buy the newest device.
Paul knows that. He is arguing for a different future. Google is about to remove my ability to remotely control my thermostat. Not even local control. Imagine a world where they would have to choose between continued device support or unlocking… or maybe just building out the local control and washing their hands of it. Having corpos as the arbiter of a consumer's buying schedule and creating unnecessary e-waste is pretty undesirable.
https://news.ycombinator.com/item?id=45081344
Did they ask? Some users can be trusted. Is there even a certification program?
What if that is the newest device?
This is just insane. Lock the devices down by default, and allow the user to unlock them if they want. Why do we have to have Big Brother devices that "benevolently" restrict what you can run "for your own good"? Why can't all phones have unlockable bootloaders? My phone has a big, scary "DO NOT DO THIS UNLESS YOU'RE A COMPUTER EXPERT" warning screen to unlock the bootloader, and that's fine.
Why do we need devices we can't unlock? Who is harmed by unlocking? This is the major point nobody has ever been able to explain to me. Who exactly does the big scary unlocked bootloader hurt? My parents have unlockable devices and they haven't had all their money stolen, because they haven't unlocked them.
On Steam Deck, you never even have to set a 'sudo' password. You can have a safe managed experience and still allow a device to be open. Option 2 is ridiculous because it will just be exploited by companies and governments that want to control what you do or what content you see.
> The problem isn't that locked-down devices exist - it's that we don't have enough truly open alternatives for those who want them.
The problem is that vendors use "locked down devices" as an excuse to limit competition.
Suppose you have a "locked down" device that can only install apps from official sources, but "official sources" means Apple, Google, Samsung or Amazon. Moreover, you can disable any of these if you want to (requiring a factory reset to re-enable), but Google or Apple can't unilaterally insist that you can't use Amazon, or for that matter F-Droid etc.
Let the owner of the device lock it down as much as they want. Do not let the vendor do this when the owner doesn't want it.
The issue with this is that inevitably the locked down devices, which will end up being 98%+ of the market, become required for ordinary living, because no-one will develop for the 2%.
Open hardware is essentially useless if I need to carry both an open phone and a phone with the parking app, the banking app, messenger app to contact friends, etc.
For security reasons it makes sense for them to be different devices. People and services may not want to allow insecure devices to communicate with them.
Why? It's not like the insecure device doesn't have my identity key on it. If I program it to spam people, I go to jail for spamming.
> The problem isn't that locked-down devices exist - it's that we don't have enough truly open alternatives for those who want them.
Not for lack of trying. See for yourself
https://en.m.wikipedia.org/wiki/List_of_open-source_mobile_p...
The list is not short.
Plenty of companies have attempted this over the years but it’s not obvious that a big enough customer base exists to support the tremendous number of engineering hours it takes to make a phone. Making a decent smart phone is really hard. And the operations needed to support production isn’t cheap either.
Maybe, rather than legislating the big companies' stores, could governments not back smaller open HW/SW vendors instead? It seems we gave up on increasing competition at the HW level, and what is left is the app store level...
I know you weren't using it in this way, but I do appreciate the double meaning of the word "protection" here.
A.k.a., "nice Google account you've got there, holding all your memories, emails, contacts, and interface to modern living; would be a shame if something happened to it because you decided to sideload an app ..."
> Locked-down, managed devices for vulnerable users who benefit from protection
That's fine! Just make sure it is possible for someone to take the same device and remove the locked-down protections.
Make it require a difficult/obvious factory reset to enable, if you are concerned about someone being "tricked" into turning off the lockdown.
If someone wants baby mode on, all power to them! That's their choice. Just like it should be everyone else's choice to own the same hardware and turn it off.
> Make it require a difficult/obvious factory reset to enable, if you are concerned about someone being "tricked" into turning off the lockdown.
Is there also a way to make it obvious to the user that a device is running non-OEM software? For example, imagine someone intercepts a new device parcel, flashes spyware on it, then delivers it in similar/the same packaging unbeknownst to the end user. The same could be said for second-hand/used devices.
The bootrom/UEFI/etc. boot process could potentially show a warning for x seconds on each boot that non-OEM software is loaded, but for that to work you need to be locked out of flashing your own bootrom to the device.
Pixel phones do this. Flashing a non-OEM ROM causes it to show a very "your device is broken"-looking screen every time you boot.
No, we need to only have option 1, because if option 2 exists, things like banking apps will all only run on it and will refuse to work on option 1.
If there is a big enough market for 1), shouldn't it exist?
The problem in my eyes seems to be that there isn't enough capital interested to sufficiently fund 1) to compete and create a comparable product. Thus, at best, we end up with much inferior products which even people semi-interested in 1) are not willing to adopt due to the extreme trade offs in usability.
Regardless of whether we expect manufacturers to let us run any code on the device, we should not restrict people from attempting to bypass the manufacturer's limitations. That gives the manufacturer the freedom to try to lock the device down, but also the owner the freedom to break those locks. Otherwise it worsens situations like the FutureHome scandal.
You're wrong.
My hardware. My decision.
I don't think it will convince you in any way, but the whole point is/will be that it's not your hardware, you're paying for a perpetual license to use a terminal bound to someone else's service.
And it really shouldn't be this way. Everyone is tricked into believing that they own devices they bought. And we are somehow supposed to accept that the abilities of the device can be reduced after we bought it just because the vendor said so. Same with (lack of) right to repair. It's really not ok, nobody (especially here) should accept that.
Option 1 is a superset of option 2 - meaning, any hackable device can also be a locked down device because hackability means the power to do whatever.
We don't need option 2, period, and it shouldn't exist.
Just put the hackability behind a switch or something. If people turn it on, that's on them.
In theory these 2 options seem like a sensible way to have a choice. But the average user is not going to own and carry 2 devices. We want to have all we need in a single device, and things like paying with your phone have become way too common by now to not have them.
Agreed, and I think we're already here. Hardware is so cheap now it's trivial to have both multiple streaming devices and multiple open computer platforms. There are advantages to both and no way to compromise to have one device for everything.
Open and hackable products have a niche user base, so these users get a niche set of options. The only way to get mainstream products to play to this tiny user base is to demand that all products be open and hackable by fiat. Otherwise, there's no incentive from anybody involved (manufacturers, app developers, etc.) to give them something that can run both their banking app and some open source app they compiled themselves. There's a lot of dancing around the security effects this will have on "normies", and although there are plenty of armchair proposals I haven't heard one that doesn't obviously degrade into some sort of alarm fatigue as both legitimate apps and malware tell you to click through a dialog or flip a setting.
I was a kid once. The hackability of the devices I owned is what led me to this career. Let's give our young ones a little more credit.
You can have some option buried in the settings; a 10-year-old kid would be able to think of this.
I think this is a false dichotomy. Open hardware with open source software would be more protected simply by being more stress tested and vetted by more people. If you need even more protection you can employ zero-knowledge proofs and other trustless technologies. I have long been dreaming about some kind of hardware/software co-op creating non-enshittifying versions of thermostats, electric kettles, EV chargers, solar inverters, etc, etc. Hackable for people who want it, simply non-rent-seeking for everyone else.
The issue here is rarely whether the security features themselves are circumventable. It's that at some point this turns into trusting users not to give malware apps permissions (whether that's a dialog, a system-wide setting, adding a third-party app store, etc.). Almost no users can usefully evaluate whether a particular bit of digital trust is a good or bad idea, so people will constantly get scammed in practice. If you're thinking about ZKPs as a solution, you're not trying to solve the actual security problems of normal users.
I think normal users will figure it out if you give them a couple of generations
> more stress tested and vetted by more people
Grandma and grandpa aren't reading the source code and certainly not up at a professional level. This is one of the core misconceptions of the "free/libre" formulation of OSS.
> Grandma and grandpa aren't reading the source code and certainly not up at a professional level.
This is one of the core misconceptions of the anti "free/libre" formulation of OSS. Most users don't need to read the entire Debian source to know that it is safe to use. You are free to look up who maintains any part of the project and look at the history of changes that have been made. A lot of projects have nice, easy to read notes along with the actual code.
If you are so paranoid that you can't even trust open release notes then why would you trust a closed project at all?
I’m not suggesting grandpa reads code, contributors do. We all know that most commercial code is much shittier than open source. Sure, commercial code usually covers more edge cases and has better UX, but is cobbled together from legacy and random product asks.
People too stupid to use computers safely should be kept away from computers for their own safety. Giving that kind of person any kind of computer would be immoral by definition. They shouldn't have phones at all, they're just going to fall for corporate approved scams from Meta, Applovin, and Indian call centers.
Do we need the second option to exist? The world is dangerous place. If you can't figure out a computer perhaps you're just unfit to participate in the modern economy.
The existence of locked-down hardware eliminates the feasibility of open hardware through network effects. That is what is happening now.
You realize you’re discounting 98% of the world’s population, right?
I think that the majority of the population can figure out how to stop installing software from untrustworthy sources, seeing as that was pretty much the norm 20 years ago.
Everyone else can put on their loincloths and go back to living in Flintstones-esque rock huts.
I think you just made up that number.
98% of the world population is rooting their phone and installing unsigned binaries? Really?
Are you sure you maybe don't have this the complete opposite way around?