
Comment by fimdomeio

20 days ago

You don't have to prevent root access. You just have to inform user of the risks, void warranties if you want but let users do whatever they want with the hardware that they own.

> "void warranties if you want "

Please don't push the Overton Window any further. Installing my own software on my own PC should never void the hardware vendor's warranty. That delegitimizes the core concept of a PC.

(A horrific possible dystopia just flashed through my mind: "I'd love to throw out Chrome and install Firefox so that I could block ads, but, the laptop is expensive, and I can't afford voiding the warranty". I bet Google would *love* that world. Or, a UK version: "I'd love to use a VPN, but, regulation banned them from the approved software markets, and anything else would permanently set the WARRANTY_VIOLATED flag in the TPM").

  • This is where it's heading, and I see this as the real driving force behind secure boot on x86_64.

  • It depends on what your software does; if it removes hardware protections, then your warranty should be voided. Of course, those protections are usually either implemented in hardware or impossible to remove, like emergency cooling or lowering the voltage when components overheat.

> You just have to inform user of the risks

Warnings aren't always enough, sometimes we have to lock people down and physically prevent them from harming themselves.

It's not always people being stupid. I recall reading an article by someone who got scammed who seemed generally quite knowledgeable about the type of scam he fell for. As he put it, he was tired, distracted, and caught at the right time.

Outside of that, a lot of the general public have a base assumption of "if the device lets me do it, it's not wrong," and just ignore the warnings. We get so many stupid pop-ups and seemingly silly warning signs (peanuts "may contain nuts") that it's easy to dismiss any given warning as just one more example of the nanny state gone mad.

  • Please read again the sentence you just typed.

    > We have to lock people down and physically prevent them from harming themselves.

    You can apply this argument to literally anything, and taken to its logical conclusion, this is exactly what will happen.

    • > sometimes we have to lock people down and physically prevent them from harming themselves

      I highlighted the word you missed, deliberately in my opinion, as it completely changes the meaning to exclude your frankly idiotic assertion.


  • > sometimes we have to lock people down and physically prevent them from harming themselves

    Seriously ill people as an exceptional last resort though, right? Or just everyone?

    • I’ll take a real world example where I watched someone start to climb over the side of a bridge. Luckily my words stopped him but I did consider whether I should pull him back and pin him to the ground for his own good.

      Is your position that it would be better for his freedom for me to let him jump if I couldn’t dissuade him?


Even if it's illegal? (like transmitting on forbidden frequencies)

It's not always the user who's installing software. Lots of people depend on other people to manage their devices. Manufacturers want the hardware they delivered to remain trusted, so that users can trust it regardless of who has handled it.

  • I always hear this as the excuse, but it is ridiculous. If the user wants to transmit on "illegal" frequencies, all he has to do is change the country setting in his Wi-Fi router, et voilà, illegal transmissions.

    The entire Android OS has about as much access to radios as your average PC, if not less. In fact, even on recent Android devices, wireless modems still tend to show up to the OS as serial devices speaking AT (Hayes) commands (even if the underlying transport isn't serial, or even if the baseband is in the same chip). Getting them to transmit on illegal frequencies is as easy or as hard as getting a 4G USB adapter to do it.
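
    The AT (Hayes) convention mentioned above can be sketched roughly as follows: the host writes a CR-terminated text command to the serial device, and the modem answers with payload lines followed by a final "OK" or "ERROR". This is only an illustration of the framing; `fake_modem` is a hypothetical stand-in, not a real serial port or any specific baseband's behavior.

```python
def frame_at_command(cmd: str) -> bytes:
    """Frame a command the way a host writes it to the serial device."""
    return (cmd + "\r").encode("ascii")

def parse_at_response(raw: bytes) -> tuple[bool, list[str]]:
    """Split a raw response into payload lines plus a success flag
    derived from the final result code ("OK" vs anything else)."""
    lines = [l for l in raw.decode("ascii").split("\r\n") if l]
    return lines[-1] == "OK", lines[:-1]

def fake_modem(frame: bytes) -> bytes:
    """Hypothetical modem that only answers AT+CSQ (signal quality)."""
    if frame == b"AT+CSQ\r":
        return b"+CSQ: 21,99\r\n\r\nOK\r\n"
    return b"ERROR\r\n"

ok, payload = parse_at_response(fake_modem(frame_at_command("AT+CSQ")))
print(ok, payload)  # True ['+CSQ: 21,99']
```

    With a real device you would read and write the same byte sequences through the modem's serial node instead of `fake_modem`, which is the sense in which the baseband looks like just another serial AT device to the OS.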

  • At least in EU, transmitting is illegal, having hardware to transmit is not.

    That's why people can buy TX/RX SDRs and Yaesu transceivers without a license.

    AFAIK, in the radio amateur world, serious violations of frequency plans are rare and are usually quickly handled by regulators. OTOH, everyone is slightly illegal, e.g. transmitting encrypted texts or overpowering their rigs, but that's part of the fun.

    • And in some locations, quickly handled by the local amateur community, with foxhunts and community outreach to frequency violators - only getting regulators involved when just talking to the offenders fails.

  • > Even if it's illegal? (like transmitting on forbidden frequencies)

    That's not relevant here. If frequencies are illegal, it should be impossible to program it in such a way. But even otherwise, it's the responsibility of the user to follow local laws. If I have a PTT phone, it's not legal for me to use forbidden frequencies just because it's possible. Why do these manufacturers care about what doesn't concern them when they violate even bigger laws all the time?

    > It's not always the user who's installing software. Lots of people depend on other people to manage their devices.

    That should be up to the user. Here we are talking about users who want to decide for themselves what their device does. You're talking as if giving the user that choice is the injustice. Nope. Taking away the choice is.

    > Manufacturers like the hardware they delivered to be trusted so users trust it regardless of who handled it.

    I see what you did here. But here is the thing. Securing a device is not antithetical to the user's freedom. That was what secure boot chain was originally supposed to accomplish until Microsoft managed to corrupt it into a tool for usurping control from the user.

    Manufacturer trust is a farce. They should be delegating that trust to the user upon the sale of the device, through well-proven concepts as explained above. They chose to distrust the user instead. Why? Greed!

    • > If frequencies are illegal, it should be impossible to program it in such a way.

      You know there's a very fine line between hardware and software in this case, so you're actually advocating for DRM-like control here.

      > They should be deligating that trust to the user upon the sale of the device, through well proven concepts as explained above.

      That same user who forgets passwords and recovery keys all the time and loses all access to documents when a device breaks? And you're presuming that giving that kind of person, who doesn't understand sh*t about backups, device security, etc., full access to their devices will not result in a lot of compromised devices?

      I'm not sure manufacturers are the best party to trust but they have an interest in a secure reputation, which the majority of dumb users or eavesdropping governments do not have.

      > They chose to distrust the user instead. Why? Greed!

      There are more reasons to distrust the user. I don't buy greed is the only relevant one.


  • Especially if it's illegal (like speaking against the government, in some countries).

    Maybe this is a bit of a hot take, but I think any government that has the ability to absolutely prevent people from breaking the law is a government with far too much power. I'm all in favor of law enforcement, but at some point it starts to cross over the line from enforcement to violation of people's free will.

Yes, very clear warnings; I could live with a small permanent icon in the status bar (via the GPU firmware) etc. But absolutely should not void warranties (overclocking might but never just root).

  • If you can't destroy your own hardware by rooting, do you have true root access?

    • Easy enough to have an efuse blow if you overvolt; then a difficult conversation on a warranty claim. Whilst ideologically this is ceding some control, I can accept it.

I don't think users understand the risks. I'm broadly accepting of protecting end users through mechanisms like these. People's entire lives are managed through these small devices. We need much better sandboxing, almost a separate 'VM' for critical apps such as banking and messaging.

The problem is the Dunning-Kruger effect.

The people who shouldn't disable these security features tend to be the first to do so. And then they complain the loudest when they enter the "find out" phase.