Comment by hnrayst

16 days ago

[flagged]

Both of your comments here, posted just one minute apart yet with completely different content, reek of LLM output.

You still go to prison for not showing it. So until devices have multiple PINs for plausible deniability, we are still screwed.

What's so hard about having 2-3 PINs, with each one unlocking a different set of logged-in apps and files?

If Apple or Android were serious about it, they would implement it, but from my research it seems someone is against it, because it's too good.

I don't want to have to remove my banking apps when I travel or go to “dangerous” places. If you're kidnapped, you will be forced to send out all your money.

  • Absolutely every aspect of it?

    What's so hard about adding a feature that effectively makes a single-user device multi-user? Which needs the ability to have plausible deniability for the existence of those other users? Which means that significant amounts of otherwise usable space need to be inaccessibly set aside for those other users on every device—to retain plausible deniability—despite an insignificant fraction of customers using such a feature?

    What could be hard about that?

    • > despite an insignificant fraction of customers using such a feature?

      Isn't that the exact same argument against Lockdown mode? The point isn't that the number of users is small; it's that the feature can significantly help that small set of users, which is something Apple clearly does care about.

      4 replies →

    • Maybe one PIN could cause the device to crash. Devices crash all the time. Maybe the storage is corrupted. It might have even been damaged when it was taken.

      This could even be a developer feature accidentally left enabled.

    • It doesn't seem fundamentally different from a PC having multiple logins accessed with different passwords. Hasn't this been a solved problem for decades?

      6 replies →

    • iOS and macOS are basically the same product, technically. The reason the iPhone is a single-user product comes down to UX decisions and business/product philosophy, not technical constraints.

      While plausible deniability may be hard to develop, it's not some particularly arcane thing. The primary reason against it is the political balancing act Apple has to manage (remember San Bernardino and the trouble the US government tried to create for Apple?). A secondary reason is the cost to develop versus the addressable market, but they did introduce Lockdown mode, so it's not unprecedented for Apple to improve security for those particularly sensitive to such issues.

      1 reply →

    • You think iPhones aren’t multi-user for technical reasons? You sure it’s not to sell more phones and iPads? Should we ask Tim “buy your mom an iPhone” Cook?

  • > You still go to prison for not showing it. So until devices have multiple PINs for plausible deniability, we are still screwed.

    > What's so hard about having 2-3 PINs, with each one unlocking a different set of logged-in apps and files?

    Besides the technical challenges, I think there's a pretty killer human challenge: it's going to be really hard for the user to create an alternate account that looks real to someone who's paying attention. Sure, you can probably fool some bored agent in the customs line who knows nothing about you, but not a trained investigator who's focused on you and knows a lot about you.

  • It's more a policy problem than a phone problem. Apple could add as many PINs as they want, but until there are proper legally based privacy protections, law enforcement will still just be like "well, how do we know you don't have a secret PIN that unlocks 40TB of illegal content? Better disappear you just to be sure"

    For as long as law enforcement treats protection of privacy as implicit guilt, the best a phone can really do is lock down and hope for the best.

    Even if there were a phone that perfectly protected your privacy and was impossible to crack, or was easy to spoof content on, law enforcement would just move the goalposts of guilt so that owning the phone itself is incriminating.

    Edit: I wanna be clear that I'm not saying any phone based privacy protections are a waste of time. They're important. I'm saying that there is no perfect solution with the existing policy being enforced, which is "guilty until proven dead"

  • How does "go to prison for not showing" work when a lot of constitutions have a clause for a suspect not needing to participate in their own conviction / right to remain silent?

    A detective can have a warrant to search someone's home or car, but that doesn't mean the owner needs to give them the key as far as I know.

    • It does mean that. You can't be forced to divulge information in your head, as that would be testimonial. But if there are papers, records, or other evidentiary materials that are e.g. locked in a safe you can be compelled to open it with a warrant, and refusal would be contempt.

      6 replies →

  • Assuming the rule of law is still functioning, there are multiple protections for journalists who refuse to divulge passwords in the USA. A journalist can challenge any such order in court and usually won't be detained during the process as long as they show up in court when required and haven't tried to destroy evidence.

    Deceiving investigators by using an alternate password, or destroying evidence by using a duress code, on the other hand, is almost always a felony. It's a very bad idea for a journalist to do that, as long as the rule of law is intact.

    • I think it's pretty clear at this point that rule of law isn't functioning. Perhaps it never was. It was just rule of law theater.

  • There is no plausible deniability here; that's only relevant in a rule-of-law type of situation, and there you wouldn't need it, because you can't be legally compelled to unlock anyway. "We don't see any secret source communication on your work device = you entered the wrong PIN = go think about your behavior in jail"

  • Even if this worked (and it would be massively expensive to implement), the misconfiguration possibilities are endless. It wouldn't be customer-centric to actually release this capability.

    Better, for the foreseeable future, to have separate devices and separate accounts (e.g. not in the same iCloud family).

  • That's a completely separate decision, with a higher legal bar.

    It's one thing to allow police to search a phone; it's another to compel someone to unlock the device.

    We live in a world of grays and nuance and an "all or nothing" outlook on security discourages people from taking meaningful steps to protect themselves.

  • “Plausible deniability” is a public relations concept. It doesn’t confer any actual legal protection.

    • It absolutely offers some legal protection, and if it is implemented correctly, no legal framework for it is required. The government forces you to enter your password; you comply and enter "a" password, and the device shows contents. You did what you were asked to do. If there is no way for the government to prove that you entered a decoy password that shows decoy contents, you are in the clear.

      Done correctly (in both the device and your OPSEC), the government can't prove you entered your decoy password, so you can't be held in contempt. And that is the entire point: it is not about asking the government to grant you "plausible deniability" rights, it is about not incriminating yourself when facing people who abuse the system to force you to incriminate yourself. A rough sketch of how a device could implement this follows below.

      2 replies →
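
      A minimal sketch of the decoy-PIN idea, under assumptions of my own: every PIN goes through the same key derivation and simply selects whichever encrypted volume it happens to unlock, and nothing on disk records how many volumes (or PINs) exist, so a decoy unlock is indistinguishable from the real one. The names below are illustrative, not any vendor's actual API:

        import hashlib

        def derive_volume_key(pin: str, salt: bytes) -> bytes:
            # Deliberately slow KDF so short PINs can't be brute-forced cheaply.
            return hashlib.scrypt(pin.encode(), salt=salt, n=2**14, r=8, p=1, dklen=32)

        def unlock(pin: str, salt: bytes, volumes: dict[bytes, bytes]) -> bytes | None:
            # `volumes` maps a keyed lookup tag to an encrypted volume blob.
            key = derive_volume_key(pin, salt)
            tag = hashlib.blake2b(b"volume-tag", key=key, digest_size=16).digest()
            # Real PIN, decoy PIN, and wrong PIN all take exactly the same code path.
            return volumes.get(tag)

      Whatever blob comes back would be decrypted and mounted as the visible file system; the point is only that the unlock check itself cannot reveal which PIN was the "real" one.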

  • Yep, you need an emergency mode that completely resets the phone to factory settings, maybe triggered with a decoy PIN. Or a mode that physically destroys the chip storing the keys.
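
    A rough sketch of the duress-PIN part, purely as an assumption-laden illustration (the key-destruction call is a placeholder for whatever a secure element would actually do; none of this is a real phone's API):

      import hashlib, hmac

      def destroy_data_encryption_keys() -> None:
          # Placeholder: on real hardware this would tell the secure element to
          # zeroise its data-encryption keys, making stored data unrecoverable.
          ...

      def check_pin(entered: str, salt: bytes, real: bytes, duress: bytes) -> str:
          h = hashlib.pbkdf2_hmac("sha256", entered.encode(), salt, 200_000)
          if hmac.compare_digest(h, duress):
              destroy_data_encryption_keys()
              return "reset"        # device now behaves like a fresh factory unit
          if hmac.compare_digest(h, real):
              return "unlocked"
          return "wrong_pin"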

  • I always wondered if this was the feature of TrueCrypt that made it such a big target. LUKS is fine, I guess, but TrueCrypt felt like actual secrecy.

  • You do not. We have this thing in our constitution called the 5th Amendment. You cannot be forced to divulge the contents of your mind, including your PIN or passwords. Case law supports this, for US citizens at least. Hopefully the constitution is still worth something.

  • Why are you on a website for programmers and software developers if you aren't a software developer and know nothing of the subject?

Serious question: What are the "valid concerns" about people securing their computing devices against third parties?

  • This (I think) refers not to the people securing their devices against third parties but the vendors "securing" the devices against loss of profits.

    Essentially, the question referenced here is that of ownership. Is it your device, or did you rent it from Apple/Samsung/etc.? If it is locked down so that you can't do anything you want with it, then you might not actually be its owner.

    ___

    _Ideally_ you wouldn't need to trust Apple as a corp to do the right thing. Of course, as this example shows, they seem to actually have done one right thing here, but you do not know whether they always will.

    That's why a lot of people believe that the idea of such tight vendor control is fundamentally flawed, even though in this specific instance it yielded positive results.

    For completeness: no, I do not know either how this could be implemented differently.

    • > Essentially, the question referenced here is that of ownership. Is it your device, or did you rent it from Apple/Samsung/etc.? If it is locked down so that you can't do anything you want with it, then you might not actually be its owner.

      Both goals actually are possible to implement at the same time: Secure/Verified Boot together with actually audited, preferably open-source, as-small-as-possible code in the boot and crypto chain; for the user, the ability to unlock the bootloader in the EFI firmware; and, for those concerned about supply-chain integrity, a debug port muxed directly (!) to the TPM so it can be queried for its set of whitelisted public keys. (A rough sketch of the verification step is below.)

      2 replies →
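
      As a rough illustration of just the signature-check step (I'm using the Python cryptography package and Ed25519 purely as stand-ins; a real boot ROM would do this in firmware against keys held by the TPM/secure element):

        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

        def verify_boot_image(image: bytes, signature: bytes,
                              enrolled_keys: list[Ed25519PublicKey]) -> bool:
            # Hand control to the next boot stage only if its signature verifies
            # against one of the enrolled (user- or vendor-whitelisted) public keys.
            for key in enrolled_keys:
                try:
                    key.verify(signature, image)  # raises InvalidSignature on mismatch
                    return True
                except InvalidSignature:
                    continue
            return False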

    • We don't know if they did the right thing here. In a previous case it seemed (to me) like Apple might have pushed an update to give access ... they presumably could do that, remotely copy all the data, then return the device to its former state. One can't know, and this sort of thing seems entirely tenable.

      The FBI doesn't have to tell anyone they accessed the device. That maintains Apple's outward appearance of security; the FBI just uses parallel construction later if needed.

      What would help is something like a hashed log (but an actually robust system), using an enclave, where the log entries are signed using your biometric, so that events such as network access where any data is exchanged are recorded and can only be removed using biometrics; a rough sketch follows below. It does nothing against wrench-based attacks, of course.

      4 replies →
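
      A toy sketch of the tamper-evident-log part of that idea, with assumptions of my own: a plain bytes key stands in for the signing key an enclave would hold and release only after a biometric check, and everything else is illustrative rather than any real API:

        import hashlib, hmac, json, time

        class AccessLog:
            def __init__(self, key: bytes):
                self._key = key
                self._entries: list[dict] = []
                self._head = b"\x00" * 32  # genesis value for the chain

            def _mac(self, body: dict) -> bytes:
                data = json.dumps(body, sort_keys=True).encode()
                return hmac.new(self._key, data, hashlib.sha256).digest()

            def append(self, event: str) -> None:
                # Each entry commits to the previous one, so deleting or editing
                # an earlier entry breaks the chain and is detectable.
                body = {"ts": time.time(), "event": event, "prev": self._head.hex()}
                mac = self._mac(body)
                self._entries.append({**body, "mac": mac.hex()})
                self._head = mac

            def verify(self) -> bool:
                head = b"\x00" * 32
                for e in self._entries:
                    body = {k: v for k, v in e.items() if k != "mac"}
                    if body["prev"] != head.hex():
                        return False
                    mac = self._mac(body)
                    if not hmac.compare_digest(mac, bytes.fromhex(e["mac"])):
                        return False
                    head = mac
                return True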

  • One valid concern about "locked down computing" is the potential for 3rd parties to secure computing devices against their owners.

  • In this case I think "valid concerns about locked down computing" refers to the owner's use of the phone being restricted: they can't download applications they want to use, they don't have unrestricted access to the filesystem, they are forced to pay an Apple commission to engage in certain forms of commerce, etc. These may be acceptable tradeoffs, but they're valid concerns nonetheless.

  • I don't have to have any concern to be able to secure my device against third parties; it's just good operational discipline.

    I don't do anything classified, or store anything I don't want found out. Equally, I don't want anyone to be able to get hold of and fiddle with a device which is central to my life.

    That's all.

    It's not "I have nothing to hide" (which I don't actually have), but I don't want to put everything in the open.

    Security is not something we should have to earn; we should have it at the highest level by default.

  • Lockdown mode significantly affects the usability of the phone.

    It completely disables JIT-compiled JavaScript in Safari, for example.

    • I do have it enabled and web browsing is still fine; the things I use are either websites or simple web apps that aren't JavaScript-heavy anyway...

      When I want to do something for longer, I'll pick up my MacBook anyway.

  • Pegasus.

    Jedi.

    sKyWIper.

    Rogue Actors.

    Rogue thieves.

    Rogue governments.

    Your spouse.

    Separating corporate IT from personal IT.

    There are plenty of reasons.

  • Oh, come on. Don't look at another man's Portal Gun history. We all go to weird places.

  • I get so annoyed by this Socratic line of questioning because it’s extremely obvious.

    Terrorist has plans and contacts on laptop/phone. Society has a very reasonable interest in that information.

    But of course there is the rational counterargument that "the government designates who is a terrorist", and the Trump admin has gleefully flouted norms around that designation, endangering the rule of law.

    So all of us are adults here and we understand this is complicated. People have a vested interest in privacy protections. Society and government often have reasonable interest in going after bad guys.

    Mediating this clear tension is what makes this so hard, and silly lines of questioning like this try to pretend it's simple.

    • The better rational counter argument is that "privacy is a human right enshrined in international law". Society has zero business knowing anyone's private communications, whether or not that person is a terrorist. There is nothing natural about being unable to talk to people privately without your speech being recorded for millions of people to view forever. Moreover, giving society absolute access to private communications is a short road to absolute dystopia as government uses it to completely wipe out all dissent, execute all the Jews or whatever arbitrary enemy of the state they decide on, etc.

      You do not get to dispense with human rights because terrorists use them too. Terrorists use knives, cars, computers, phones, clothes... where will we be if we take away everything because we have a vested interest in denying anything a terrorist might take advantage of?

      9 replies →

    • This means there are no valid concerns.

      There are just things some people want and the reasons they want them.

      So the question that you are so annoyed by remains unanswered (by you, anyway), and so remains valid, to all of us adults.

      @hypfer gives a valid concern, but it's based on a different facet of lockdown. The concern is not that the rest of us should be able to break into your phone for our safety; it's the opposite: that you are not the final authority over your own property, and must simply trust Apple (and, via our ability to compel Apple, the entire rest of society) not to break into your phone or its backup.

    • At the risk of being kind of an ass, which I've been trying to be better about lately, I'm going to offer some advice. If you can't even respond to a question about secure computing without bringing American presidential politics into things, perhaps you need to take a break from the news for a few weeks.

      The reason I asked that question is because I don't think it's complicated. I should be able to lock down my device such that no other human being on the planet can see or access anything on it. It's mine. I own it. I can do with it whatever I please, and any government that says otherwise is diametrically opposed to my rights as a human being.

      You are more likely to be struck by lightning while holding two winning lottery tickets from different lotteries than you are to be killed by an act of terrorism today. This is pearl-clutching, authoritarian nonsense. To echo the sibling comment, society does not get to destroy my civil rights because some inbred religious fanatics in a cave somewhere want to blow up a train.

      Edit: And asking someone who says "there are concerns!" to proffer even a single one is not a Socratic line of questioning; it's basic inquiry.

      2 replies →

    • > I get so annoyed by this Socratic line of questioning because it’s extremely obvious.

      Yeah after seeing the additional comments, my gut also says "sea lion".

      Truly a shame

    • > ...the Trump admin has gleefully flouted norms around that designation...

      One would have to hold a fairly uninformed view of history to think the norms around that designation are anything but invasive. The list since FDR is utterly extensive.

      4 replies →

  • Some platforms will side-load anything the telecom carrier sends.

    It is naive to assume iOS can be trusted much more than Android. =3

> It's a real world example of how these security features aren't just for "paranoid people" but serve a legit purpose for people who handle sensitive info.

Because they're in the US, things might be easier from a legal standpoint for the journalist, but there is also precedent for forcing journalists to expose their sources: https://en.wikipedia.org/wiki/Branzburg_v._Hayes

In other parts of the world, this applies (https://xkcd.com/538/) when you don't give the authorities the means to access your phone.

It just depends on how much a government wants the data that is stored there.

  • Which countries actually grant reporters immunity from having to reveal information related to criminal investigations (where others would be compelled to, and without criminal penalties)? Such immunity may be desirable (at least in some circumstances), but I am not aware of any jurisdiction that actually grants it.

    • At least in Finland there's a specific law about journalistic source protection (lähdesuoja) explicitly saying journalists have the right to not reveal sources.

      In serious criminal cases, in some circumstances, a court may order a journalist to reveal sources. But it's extremely rare, and journalists don't comply even if ordered.

      https://fi.wikipedia.org/wiki/L%C3%A4hdesuoja

      Edit: the source protection has actually probably never been broken (due to a court order at least): https://yle.fi/a/3-8012415

      1 reply →

Indeed, likely as secure as the VPNs run by intelligence contractors.

1. iOS has well-known but poorly documented zero-click exploits

2. Firms are required to retain your activity logs for 3 months

3. It is illegal for a firm to deny or disclose sealed warrants on US soil, and it is up to a single judge whether to rummage through your trash. If I recall correctly, only around 8 out of 18,000 searches were rejected.

It is only about $23 to MITM someone's phone now, and it is not always domestic agencies pulling that off. =3

  • > 1. iOS has well-known but poorly documented zero-click exploits

    PoC || GTFO, to use the vernacular.

    If you're talking about historical bugs, don't forget the update adoption curves.

    • No one will hand over a multi-million-dollar 0-day as a free PoC, as there are grey-market products based on the same tired exploits.

      "Not My Circus, Not My Monkeys" as they say. =3

      1 reply →

With the US descending further and further into fascism (as this case highlights yet again), I wonder what will happen to these features in the future, especially now that the tech moguls of Silicon Valley have stopped standing up to Trump and instead started kissing his ass. Tim Cook in particular seems to be the kind of person who would rather be on the rich side of history than the right side. What if the administration realizes it can easily make Apple et al. give up their users by threatening their profits with tariffs and taxes?

Apple seems to strongly discourage the use of lockdown mode. Presumably it is in conflict with their concern over share price and quarterly earnings.

  • How do they discourage it? It’s a clearly-labeled button in the Settings app, which brings up one modal sheet explaining what will change if you turn it on, then one more button press and it’s on.

  • Citation needed?

    Apple does a lot of things I don't agree with in the interest of share price (like cozying up to authoritarian governments) but this seems like a reach to criticize them for a feature they have put extensive effort into, rather than applauding that they resist spying and enhance customer privacy. Sure, it's an optional feature and maybe they don't push broad acceptance of it, but it's important for those that need it.

    • Indeed. It may be the best reason to use their products, but then why not make it the default, or do more to encourage its use?

  • Didn’t they make it?

    • Is it supported in iOS 18? They seem to suggest in their own documentation that very few people need it or should use it. They could do much more to encourage and support its use. Even the naming ("Lockdown" vs. "Secure") is a big tell.