The FBI was able to access Washington Post reporter Hannah Natanson's Signal messages because she used Signal on her work laptop. The laptop accepted Touch ID for authentication, which meant the agents were allowed to require her to unlock it with her fingerprint.
I actually think it is fitting to read about a government agency weaponized by an unscrupulous billionaire going after journalists working for an unscrupulous billionaire on an unscrupulous trillionaire owned platform.
Being held in contempt at least means you got a day in court first. A judge telling me to give up my password is different than a dozen armed, masked secret police telling me to.
This command will make your MacBook hibernate when the lid is closed or the laptop sleeps, so RAM is written to disk and the system powers down. The downside is that resuming takes longer.
A nice side benefit is that a fingerprint is not accepted on the first unlock after hibernation; I believe secrets are still encrypted at that stage, similar to a cold boot. A fingerprint still unlocks from the screensaver normally, as long as the system does not sleep (and therefore hibernate).
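For reference, the hibernation setup being described is usually done with pmset; a minimal sketch, with the exact values being my assumption rather than a quote from the linked comment:

    # Write RAM to disk and power down memory on sleep (true hibernation)
    sudo pmset -a hibernatemode 25
    # Also drop the FileVault key from memory on standby, so a password
    # (not Touch ID) is needed on wake
    sudo pmset -a destroyfvkeyonstandby 1

Resuming from this state takes noticeably longer, as the comment above says, since the whole memory image has to be read back from disk.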
> I believe secrets are still encrypted at this stage similar to cold boot.
Does this mean that the Signal desktop application doesn't lock/unlock its (presumably encrypted) database with a secret when locking/unlocking the laptop?
Is the knowledge of which finger to use protected as much as a passcode? Law enforcement might have the authority to physically hold the owner's finger to the device, but it seems that the owner has the right to refuse to disclose which finger is the right one. If law enforcement doesn't guess correctly in a few tries, the device could lock itself and require the passcode.
Another reason to use my dog's nose instead of a fingerprint.
I really wish Apple would offer a PIN option on macOS, for this reason precisely. Either that, or an option to automatically disable Touch ID after a short amount of time (e.g. an hour, or if my phone doesn't connect to the laptop).
There are only ten possible guesses, and most people use their thumb and/or index finger, leaving four much likelier guesses.
Also, IANAL, but I'm pretty sure that if law enforcement has a warrant to seize property from you, they're not obligated to do so the instant they see you - they could have someone follow you and watch how you unlock your phone before seizing it.
A 1-in-10 chance per attempt is already very good odds for them, and multiplied over several tries it's even better. Also, most people enroll two fingers for Touch ID, which pushes the overall chance close to half in practice.
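To put rough numbers on that (a back-of-the-envelope model, not something from the thread): if $m$ of the 10 fingers are enrolled and an examiner gets $k$ blind guesses before the device falls back to the passcode, the chance of hitting an enrolled finger is

    P = 1 - \binom{10-m}{k} / \binom{10}{k}

For $m = 2$ and $k = 3$ that is $1 - 56/120 \approx 0.53$, already roughly a coin flip before accounting for thumbs and index fingers being the obvious first guesses.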
It's something you know vs. something you have. That's how the legal system sees it. You might not tell someone the pin to your safe, but if police find the key to it, or hire a locksmith to drill out your safe, it's theirs with a warrant.
It's interesting in the case of social media companies. Technically the data held is the companies data (Google, Meta, etc.) however courts have ruled that a person still has an expectation of privacy and therefore police need a warrant.
When they arrest you, they have physical control of your body. You're in handcuffs. They can put your fingers against the unlock button. You can make a fist, but they can have more strength and leverage to unfist your fist.
There's no known technique to force you to input a password.
Compelled testimony is protected, fingerprints aren't.
Imagine it's 1926 and none of this tech is an issue yet. The police can fingerprint and photograph you at intake, they can't compel speech or violate the 5th.
That's exactly what's being applied here. It's not that the police can do more or less than they could in 1926, it's that your biometrics can do more than they did in 1926. They're just fingerprinting you / photographing you .. using your phone.
Also, using biometrics on a device, and your biometrics unlock said device, do wonders for proving to a jury that you owned and operated that device. So you're double screwed in that regard.
Serious question. If I am re-entering the US after traveling abroad, can customs legally ask me to turn the phone back on and/or seize my phone? I am a US citizen.
Out of habit, I keep my phone off during the flight and turn it on after clearing customs.
Did you know that on most models of iPhone, saying "Hey Siri, whose iPhone is this?" will disable biometric authentication until the passcode is entered?
In case anyone is wondering: In newer versions of MacOS, the user must log out to require a password. Locking screen no longer requires password if Touch ID is enabled.
Everyone makes this same comment on each of these threads, but it's important to remember this only works if you have some sort of advance warning. If you have the iPhone in your hand and there is a loaded gun pointed at your head telling you not to move, you probably won't want to move.
Or squeeze the power and volume buttons for a couple of seconds. It’s good to practice both these gestures so that they become reflex, rather than trying to remember them when they’re needed.
One thing I miss from Windows (on Mac now) is an encrypted vault program that you could hide so it wasn't on the desktop or in the program list but could still be launched. That way you could have private stuff that attackers would likely not even know was there.
As if the government is not above breaking the law and using rubber hose decryption. The current administration’s justice department has been caught lying left and right
Plausible deniability could still work: you enter your duress code and your system boots to a secondary partition with Facebook and Snapchat. Sadly, no such OS exists.
I just searched the case. I'm appalled. It looks like the USA doesn't have legal protection for reporters' sources. Or rather, Biden created some, but it was revoked by the current administration.
The real news here isn't privacy controls in a consumer OS or the right to privacy, but the USA, the leader of the free world, becoming an autocracy.
I find it so frustrating that Lockdown Mode is so all-or-nothing.
I want some of the lockdown stuff (No facetime and message attachments from strangers, no link previews, no device connections), but like half of the other ones I don't want.
Why can't I just toggle an iMessage setting for "no link previews, no attachments", or a general setting for "no automatic device connection to untrusted computers while locked"? Why can't I turn off "random dick pics from strangers on iMessage" without also turning off my browser's JavaScript JIT and a bunch of other random crap?
Sure, leave the "Lockdown mode" toggle so people who just want "give me all the security" can get it, but split out individual options too.
Just to go through the features I don't want:
* Lockdown Mode disables javascript JIT in the browser - I want fast javascript, I use some websites and apps that cannot function without it, and non-JIT js drains battery more
* Shared photo albums - I'm okay viewing shared photo albums from friends, but lockdown mode prevents you from even viewing them
* Configuration profiles - I need this to install custom fonts
Apple's refusal to split out more granular options here hurts my security.
I’m with you on the shared photo albums. I’d been using lockdown mode for quite a while before I discovered this limitation, though. For me, this is one I’d like to be able to selectively enable (like the per-website/app settings). In my case, it was a one-off need, so I disabled lockdown mode, shared photos, then enabled it again.
The other feature I miss is screen time requests. This one is kinda weird - I’m sure there’s a reason they’re blocked, but it’s a message from Apple (or directly from a trusted family member? I’m not 100% sure how they work). I still _receive_ the notification, but it’s not actionable.
While I share your frustration, though, I do understand why Apple might want to have it as “all-or-nothing”. If they allow users to enable even one “dangerous” setting, that ultimately compromises the entire security model. An attacker doesn’t care which way they can compromise your device. If there’s _one_ way in, that’s all they need.
Ultimately, for me the biggest PiTA with lockdown mode is not knowing if it’s to blame for a problem I’m having. I couldn’t tell you how many times I’ve disabled and re-enabled it just to test something that should work, or if it’s the reason a feature/setting is not showing up. To be fair, most of the time it’s not the issue, but sometimes I just need to rule it out.
>* Lockdown Mode disables javascript JIT in the browser - I want fast javascript, I use some websites and apps that cannot function without it, and non-JIT js drains battery more
This feature has the benefit of teaching users (correctly) that browsing the internet on a phone has always been a terrible idea.
Sadly, they still got to her Signal messages on her desktop – her sources might still be compromised. It's inherent to desktop applications, but I wish more people knew that Signal for Desktop is much, much less secure against adversaries who have your laptop.
In addition to what the other person who replied said, even ignoring that iOS/Android/iPadOS is far more secure than macOS, laptops have significantly fewer hardware-based protections than Pixel/Samsung/Apple mobile devices do. So really the only way a laptop in this situation would be truly secure from LEO is if it's fully powered off when it's seized.
My assumption is that the key in the desktop version is not always stored in the secure enclave (it definitely supports plaintext storage). Theoretically this makes it possible to extract the key for the message database, and a different malicious program can also read it. But this is moot anyway if the FBI can simply browse through the chats on an unlocked laptop; the key storage isn't what failed here.
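A rough way to see what's meant, with the path and key names recalled from memory rather than taken from the article, so treat them as assumptions:

    # macOS path; on Linux it's ~/.config/Signal/
    cat "$HOME/Library/Application Support/Signal/config.json"
    # Older installs: {"key": "..."}          - SQLCipher key in plaintext on disk
    # Newer installs: {"encryptedKey": "..."} - key wrapped via the OS keychain

Either way, once the OS user session is unlocked, the message database can be opened without ever touching the phone.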
That's a strong statement. Also imho it's important that we use Signal for normal stuff like discussing where to get coffee tomorrow - no need for disappearing messages there.
Yea, I also would want to question the conclusions in the article. Was the issue that they couldn't unlock the iPhone, or that they had no reason to pursue the thread? To my understanding, the Apple ecosystem means that everything is synced together. If they already got into her laptop, wouldn't all of the iMessages, call history, and iCloud material already be synced there? What would be the gain of going after the phone, other than to make the case slightly more watertight?
I would have thought reporters with confidential sources at that level would already exercise basic security hygiene. Hopefully, this incident is a wake up call for the rest.
Is there an implication here that they could get into an iPhone with lower security settings enabled? There's Advanced Data Protection, which E2EEs more of your data in iCloud. There's the FaceID unlock state, which US law enforcement can compel you to unlock; but penta-click the power button and you go into PIN unlock state, which they cannot compel you to unlock.
My understanding of Lockdown Mode was that it babyifies the device to reduce the attack surface against unknown zero-days. Does the government saying that Lockdown Mode barred them from entering imply that they've got an unknown zero-day that would work in the PIN-unlock state, but not Lockdown Mode?
This was known in the past, but since it relies on zero-days that Apple & Google are adversarially trying to keep up with and patch, my assumption would not be that Pegasus is, at any given time, able to breach a fully updated iPhone. Rather, it's a situation where there are maybe periods of a few months at a time where they have a working exploit, until Apple discovers it and patches it, repeat indefinitely.
The NSO Group is on the entity list, so no Western government is using it. And it was never used to gain access to devices they already had physical control over.
> Natanson said she does not use biometrics for her devices, but after investigators told her to try, “when she applied her index finger to the fingerprint reader, the laptop unlocked.”
I want to say that was generous of her, but one thing that's weird: if I didn't want someone getting into my laptop and they tried to force me to unlock it with my fingerprint, I definitely wouldn't use the enrolled finger on the first try. Hopefully, Apple locks it out and forces a password if you use the wrong finger “accidentally” a couple of times.
She must have set it up before; there is no other way for the laptop to have her fingerprint. I guess the only other explanation would be a faulty fingerprint sensor, but that should default to denying entry.
Could be a parallel construction type thing. They already have access but they need to document a legal action by which they could have acquired it so it doesn't get thrown out of court.
I think this is pretty unlikely here but it's within the realm of possibility.
The fingerprint sensor does not make access control decisions, so the fault would have to be somewhere else (e.g. in the software logic that decides what to do with the response from the secure enclave).
My read on this is that she tried to bluff, even though the odds were astronomically high that they'd call her on it. She didn't have anything to lose by trying a little white lie. It's what I would have done in the same situation, anyway.
There appear to be relatively few possibilities.
* The reporter lied.
* The reporter forgot.
* Apple devices share fingerprint matching details and another device had her details (this is supposed to be impossible, and I have no reason to believe it isn't).
* The government hacked the computer such that it would unlock this way (probably impossible as well).
* The fingerprint security is much worse than years of evidence suggests.
Mainly it was buried at the very end of the article, and I thought it worth mentioning here in case people missed it.
Don't be idiots. The FBI may say they can't get in whether or not that's true:
1. If they can get in, claiming otherwise means people - including high-value targets like journalists - keep using security the FBI can already beat.
2. If the FBI (or another agency) has an unknown capability, it must say it can't get in, or else it reveals that capability to all adversaries, including even higher-profile targets such as counter-intelligence targets. Saying nothing also risks revealing the capability.
3. Similarly, if Apple helped them, Apple might insist that not be revealed. The same applies to any third party with the capability. (Also, less significantly, saying they can't get in puts more pressure on Apple over creating backdoors, even if HN readers will see it the other way.)
Also, the target might think they are safe, which could be a tactical advantage. Claiming inability may also keep recovered data outside the normal rules for handling evidence, even if it's unusable in court. And at best they haven't gotten in yet - there may be an exploit for this OS version someday, and the FBI can try again then.
I would not recommend that one trust a secure enclave with full disk encryption (FDE). This is what you are doing when your password/PIN/fingerprint can't contain sufficient entropy to derive a secure encryption key.
The problem with low-entropy security measures is that the low-entropy secret is only used to instruct the secure enclave (TEE) to release/use the actual high-entropy key. So that key must be stored physically (e.g. as voltage levels) somewhere in the device.
It's a similar story when the device is locked: on most computers the RAM isn't even encrypted, so a locked computer is no major obstacle to an adversary. On devices where RAM is encrypted, the encryption key is also stored somewhere - if only while the device is powered on.
RAM encryption doesn’t prevent DMA attacks, and performing a DMA attack is quite trivial as long as the machine is running. Secure enclaves do prevent those, and they're a good solution; if implemented correctly, they have no downsides. I'm not referring to TPMs, due to their inherent flaws; I’m talking about SoC crypto engines like those found in Apple’s M series or Intel's latest Panther Lake lineup. They prevent DMA attacks and side-channel vulnerabilities. True, I wouldn’t trust any secure enclave never to be breached – that’s an impossible promise to make, even if breaching one would require a nation-state-level attack – but even this concern can easily be addressed by making the final encryption key depend on both software key derivation and the secret stored within the enclave.
I recommend reading the AES-XTS spec, in particular the “tweak”. Or for AES-GCM, look at how the IV works.
I also recommend looking up PUFs and how modern systems use them in conjunction with user-provided secrets to derive keys - a password or fingerprint is one of many inputs into a KDF that produces the final keys.
The high-level idea is that the key used for encryption is derived from a well-randomized, well-protected, device-unique secret set up at manufacturing time. Your password/fingerprint/whatever just adds a little extra entropy to that already cryptographically sound seed.
Tl;dr this is a well solved problem on modern security designs.
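A toy sketch of that "many inputs into a KDF" idea, using openssl's kdf command; the device secret here is a stand-in for a fused UID/PUF value that real hardware never exposes, and the exact flags are from memory, so treat this as illustrative only:

    # Derive a 256-bit volume key from a device-unique secret plus the user's passcode.
    # In a real design the derivation happens inside the secure element and the
    # device secret never leaves it.
    DEVICE_SECRET="f3a9c1..."   # hypothetical fused-at-manufacture value
    USER_PASSCODE="123456"
    openssl kdf -keylen 32 -kdfopt digest:SHA2-256 \
        -kdfopt key:"$DEVICE_SECRET" \
        -kdfopt salt:"$USER_PASSCODE" \
        -kdfopt info:"fde-volume-key" HKDF

The point is that the passcode alone can be low-entropy because the device secret supplies the rest; the trade-off is that the combined key only exists on that one device.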
It seems unfortunate that enhanced protection against physically attached devices requires enabling a mode that is much broader, and sounds like it has a noticeable impact on device functionality.
I never attach my iPhone to anything that's not a power source. I would totally enable an "enhanced protection for external accessories" mode. But I'm not going to enable a general "Lockdown mode" that Apple tells me means my "device won’t function like it typically does"
There is a setting as of iOS 26 under "Privacy & Security > Wired Accessories" in which you can make data connections always prompt for access. Not that there haven't been bypasses for this before, but perhaps still of interest to you.
GrapheneOS does this by default - only power delivery when locked. Also it's a hardware block, not software. Seems to be completely immune to these USB exploit tools.
It also has various options to adjust the behaviour, from no blocks at all, to not even being able to charge the phone (or use the phone to charge something else) -- even when unlocked. Changing the mode of operation requires the device PIN, just as changing the device PIN does.
Note that it behaves subtly differently from how you described if it was connected to something before being locked. In that case data access will remain -- even though the phone is now locked -- until the device is disconnected.
Can a hacked phone (such as one that was not in Lockdown Mode at one point in time) persist in a hacked state?
Obviously, the theoretical answer is yes, given an advanced-enough exploit. But let's say Apple is unaware of a specific rootkit. If each OS update is a wave, is the installed exploit more like a rowboat or a frigate? Will it likely be defeated accidentally by minor OS changes, or is it likely to endure?
This answer is actionable. If exploits are rowboats, installing developer OS betas might be security-enhancing: the exploit might break before the exploiters have a chance to update it.
Forget OS updates. The biggest obstacle to exploit persistence: a good old hard system reboot.
Modern iOS has an incredibly tight, secure chain-of-trust bootloader. If you shut your device down to a known-off state (using the hardware key sequence), then on power-on you can be 99.999% certain only Apple-signed code will run all the way from SecureROM to iOS userland. The exception is if the SecureROM is somehow compromised and exploited remotely (this requires hardware access at boot time, so I don't buy it).
So, on a fresh boot, you are almost definitely running authentic Apple code. The easiest path to a form of persistence is reusing whatever vector initially pwned you (malicious attachment, website, etc) and being clever in placing it somewhere iOS will attempt to read it again on boot (and so automatically get pwned again).
But honestly, exploiting modern iOS is already difficult enough (exploits go for tens of millions of USD); persistence is an order of magnitude more difficult.
Re: reboots – TFA states that recent iPhones reboot every 3 days when inactive for the same reasons. Of course, now that we know that it's linked to inactivity, black hatters will know how to avoid it...
You should read up on iOS internals before commenting stuff like this. Your answer is wrong, and rootkits have been dead on most OSes for years, but ESPECIALLY iOS. Not every OS is like Linux, where security comes second.
Even a cursory glance would show it's literally impossible on iOS.
In China, there is only one way to deal with this situation: when the police summon you for the first time, do not bring your phone. Before the second summons, get a new phone or completely format your old one. However, this does not apply in cases of ongoing crimes or when someone is already wanted by the authorities, as they will not be given a second chance.
Depending on your jurisdiction faceid is safer than fingerprint, because faceid won’t unlock while your eyes are closed.
In many European countries forcing your finger on a scanner would be permissible under certain circumstances, forcing your eyes open so far has been deemed unacceptable.
"Lockdown Mode is a sometimes overlooked feature of Apple devices that broadly make[sic] them harder to hack."
Funny to see disabling "features" itself described as "feature"
Why not call it a "setting"
Most iPhone users do not change default settings. That's why Google pays Apple billions of dollars for a default setting that sends data about users to Google
"Lockdown Mode" is not a default setting
The phrase "sometimes overlooked" is an understatement. It's not a default setting and almost no one uses it
If it is true Lockdown Mode makes iPhones "harder to hack", as the journalist contends, then it is also true that Apple's default settings make iPhones "easier to hack"
The intention behind lockdown mode is protection for a select few groups of people such as journalists, that are at risk of having software like Pegasus used against them. It’s to reduce the attack surface. The average user wouldn’t want most of it as a default setting, for example: almost no message attachments allowed, no FaceTime calls from people you haven’t called and safari is kneecapped. Making this a default setting for most people is unrealistic and also probably won’t help their cybersecurity as they wouldn’t be targeted anyway.
A "reduced attack surface" can also be a reduced surface for telemetry, data collection, surveillance and advertising services, thereby directly or indirectly causing a reduction in Apple revenues
Perhaps this could be a factor in why it's not a default setting
Can anyone speak to the relative safety or lack thereof using FaceID on individual apps while requiring a PIN to login to the device?
I have my phone setup this way because FaceID can be so convenient. I know it opens up more attack vectors than not using it but is it possible for a powerful actor to utilize the fact that it is enabled at all to gain access to a locked phone?
It sounds like almost all of our devices ship with security-by-annoyance as the default. Where are the promises of E2E encryption and all the privacy measures? When I turned on Lockdown Mode on my iPhone, I got a few notifications showing that the random spam calls I receive were attempting a FaceTime exploit. How come we have to wait for someone to prove ICE can't get into our devices?
I trust 404 media more than most sources, but I can’t help but reflexively read every story prominently showcasing the FBI’s supposed surveillance gaps as attempted watering hole attacks. The NSA almost certainly has hardware backdoors in Apple silicon, as disclosed a couple of years ago by the excellent researchers at Kaspersky. That being the case, Lockdown Mode is not even in play.
My Google pixel 5a randomly requires the pin/password every couple of days and will not accept biometrics. I have always assumed this was to heavily discourage using long passwords for this very reason.
It's unlikely that Pegasus would work since Apple patched the exploit it used.
I think it's unclear whether Cellebrite can or cannot get around Lockdown Mode as it would depend very heavily on whether the technique(s)/exploit(s) Cellebrite uses are suitable for whatever bugs/vulnerabilities remain exposed in Lockdown Mode.
I use the Cryptomator app for this, it works as advertised. I keep ~60 GiB of personal files in there that would be an easy button to steal my identity and savings. I'm just hoping it doesn't include an NSA back door.
If they're not investigating her she doesn't have any 5th-amendment protection and can be compelled to testify on anything relevant, including how to unlock her devices.
Every time I see these articles about iphones posing trouble for authorities, I always think of it as free (and fraudulent) advertisement.
I could be naive, but just don't think they'd really have any difficulty getting what they needed. Not that I give a fuck, but I guess I've seen one too many free ads.
What’s so hard about adding a feature that effectively makes a single-user device multi-user? Which needs the ability to have plausible deniability for the existence of those other users? Which means that significant amounts of otherwise usable space needs to be inaccessibly set aside for those others users on every device—to retain plausible deniability—despite an insignificant fraction of customers using such a feature?
> Still go to prison for not showing. So until devices have multiple pins for plausible deniability we are still screwed.
> What’s so hard to make 2-3 pins and each to access different logged in apps and files.
Besides the technical challenges, I think there's a pretty killer human challenge: it's going to be really hard for the user to create an alternate account that looks real to someone who's paying attention. Sure, you can probably fool some bored agent in customs line who knows nothing about you, but not a trained investigator who's focused on you and knows a lot about you.
It's more a policy problem than a phone problem. Apple could add as many pins as they want but until there are proper legal based privacy protections, law enforcement will still just be like "well how do we know you don't have a secret pin that unlocks 40TB of illegal content? Better disappear you just to be sure"
For as long as law enforcement treats protection of privacy as implicit guilt, the best a phone can really do is lock down and hope for the best.
Even if there was a phone that existed that perfectly protected your privacy and was impossible to crack or was easy to spoof content on, law enforcement would just move the goal post of guilt so that owning the phone itself is incriminating.
Edit: I wanna be clear that I'm not saying any phone based privacy protections are a waste of time. They're important. I'm saying that there is no perfect solution with the existing policy being enforced, which is "guilty until proven dead"
How does "go to prison for not showing" work when a lot of constitutions have a clause for a suspect not needing to participate in their own conviction / right to remain silent?
A detective can have a warrant to search someone's home or car, but that doesn't mean the owner needs to give them the key as far as I know.
Assuming the rule of law is still functioning, there are multiple protections for journalists who refuse to divulge passwords in the USA. A journalist can challenge any such order in court and usually won't be detained during the process as long as they show up in court when required and haven't tried to destroy evidence.
Deceiving investigators by using an alternate password, or destroying evidence by using a duress code on the other hand is almost always a felony. It's a very bad idea for a journalist to do that, as long as the rule of law is intact.
There is no plausible deniability here; that's only relevant in a rule-of-law situation, and there you wouldn't need it, since you can't be legally compelled anyway. "We don't see any secret source communication on your work device = you entered the wrong PIN = go think about your behavior in jail."
Even if this worked (which would be massively expensive to implement) the misconfiguration possibilities are endless. It wouldn't be customer-centric to actually release this capability.
Better for the foreseeable future to have separate devices and separate accounts (i.e. not in the same iCloud family for instance)
Completely separate decision with a higher legal bar for doing that.
It's one thing to allow police to search a phone. Another to compel someone to unlock the device.
We live in a world of grays and nuance and an "all or nothing" outlook on security discourages people from taking meaningful steps to protect themselves.
Yep, you need an emergency mode that completely resets the phone to factory settings, maybe triggered with a decoy pin. Or a mode that physically destroys the chip storing the keys
You do not. We have this thing in our constitution called the 5th amendment. You cannot be forced to divulge the contents of your mind, including your pin or passwords. Case law supports this. For US citizens at least. Hopefully the constitution is still worth something.
This (I think) refers not to the people securing their devices against third parties but the vendors "securing" the devices against loss of profits.
Essentially, the question referenced here is that of ownership. Is it your device, or did you rent it from Apple/Samsung/etc.
If it is locked down so that you can't do anything you want with it, then you might not actually be its owner.
___
_Ideally_ you wouldn't need to trust Apple as a corp to do the right thing.
Of course, as this example shows, they seem to actually have done one right thing, but you do not know if they will always do.
That's why a lot of people believe that the idea of such tight vendor control is fundamentally flawed, even though in this specific instance it yielded positive results.
For completeness, No, I do not know either how this could be implemented differently.
In this case I think "valid concerns about locked down computing" is referring to the owner's use of the phone being restricted, so that they can't download applications they want to use, they don't have unrestricted access to the filesystem, they are forced to pay an Apple commission to engage in certain forms of commerce, etc. These may be acceptable tradeoffs but they're valid concerns nonetheless.
I don't have to have any concern to be able to secure my device against third parties, it's just good operational discipline.
I don't do anything classified, or store anything I don't want found out. Equally, though, I don't want anyone to be able to get at and fiddle with a device that is central to my life.
That's all.
It's not "I have nothing to hide" (which I don't actually have), but I don't want to put everything in the open.
Security is not something we should have to earn; we should have it at the highest level by default.
I get so annoyed by this Socratic line of questioning because it’s extremely obvious.
Terrorist has plans and contacts on laptop/phone. Society has a very reasonable interest in that information.
But of course there is the rational counter-argument that "the government designates who is a terrorist", and the Trump admin has gleefully flouted norms around that designation, endangering the rule of law.
So all of us are adults here and we understand this is complicated. People have a vested interest in privacy protections. Society and government often have reasonable interest in going after bad guys.
Mediating this clear tension is what makes this so hard and silly lines of questioning like this try to pretend it’s simple.
> It's a real world example of how these security features aren't just for "paranoid people" but serve a legit purpose for people who handle sensitive info.
Because they're in the US, things might be easier from a legal standpoint for the journalist, but there's also precedent for forcing journalists to expose their sources: https://en.wikipedia.org/wiki/Branzburg_v._Hayes
In other parts of the world, https://xkcd.com/538/ applies when you don't provide the authorities with the means to access your phone.
It just depends on how much a government wants the data that is stored there.
Which countries actually grant reporters immunity from having to reveal information related to criminal investigations (where others would be compelled to, and without criminal penalties)? Such immunity may be desirable (at least in some circumstances), but I am not aware of any jurisdiction that actually grants it.
Indeed, likely as secure as the VPNs run by intelligence contractors.
1. iOS has well-known, poorly documented zero-click exploits
2. Firms are required to retain your activity logs for 3 months
3. It is illegal for a firm to deny or disclose sealed warrants on US soil, and it is up to a single judge whether to rummage through your trash. If I recall, only around 8 out of 18,000 searches were rejected.
It is only about $23 to MITM someone's phone now, and it is not always domestic agencies pulling that off. =3
With the US descending more and more into fascism (as this case highlights yet again), I wonder what will happen to these features in the future. Especially now that the tech moguls of silicon valley stopped standing up to Trump and instead started kissing his ass. Tim Cook in particular seems to be the kind of person that rather is on the rich side of history than the right side. What if the administration realizes they can easily make Apple et al. give up their users by threatening their profits with tariffs and taxes?
How do they discourage it? It’s a clearly-labeled button in the Settings app, which brings up one modal sheet explaining what will change if you turn it on, then one more button press and it’s on.
Apple does a lot of things I don't agree with in the interest of share price (like cozying up to authoritarian governments) but this seems like a reach to criticize them for a feature they have put extensive effort into, rather than applauding that they resist spying and enhance customer privacy. Sure, it's an optional feature and maybe they don't push broad acceptance of it, but it's important for those that need it.
The trick is not to use your right index finger as a biometric unlock finger (the Touch ID button sits at the top-right corner of the keyboard). If you are "forced" to unlock, the agents will guide your fingers and probably try that one first, 2-3 times. Two more tries and fingerprint reading gets disabled. Quite good odds.
It's interesting because the latest Cellebrite data sheets showed them supporting all iPhones, including e.g. unbooted ones, but apparently not Lockdown Mode? They also showed they hadn't cracked GrapheneOS.
Wait, was this an oversight on his part about the biometric unlock? My MacBook's biometrics get disabled and a password is required if the lid has been closed for a substantial amount of time.
Does anyone know if iOS in Lockdown Mode stops syncing mail, iMessage, call history, etc. to your other Apple devices? I am wondering if the reporter's stuff was all synced to the non-lockdown MacBook from the iPhone.
The warrant is the force; current jurisprudence largely says warrants can compel people to provide biometric unlocks because it's not speech the way giving up a password/passcode would be. Blocking or not complying with a signed warrant from a judge is its own crime, and the only safe way to fight one is with a lawyer in court, not with the officer holding the paper (and the gun/taser/etc., with the power of the state behind them).
What do you think warrants are? You think they get a warrant and they say, "Can you put your finger on the device?" You say, "No," and that's it? If all they wanted to do was ask you, they would just ask you without the warrant.
Do you disagree with the facts of the article? Or is it propaganda simply because the facts don't support your narrative and ideological inclinations?
Selective amplification of true events, as well as selective reporting, is the bread and butter of modern propaganda. It works a lot better than stating outright falsehoods, which - in the long term - cause people to lose faith in everything you have to say. And there's always someone jumping to your defense - after all, you did not outright lie...
Man, people are whiny about this on Hacker News when they should know better. There is no real computer security without hardware roots of trust and keystores.
Note that these are only uncrackable if you have a strong password (a random one will work). Unlike on phones, there is nothing slowing down brute-force attempts, only the comparatively much weaker PBKDFs if you use a password. You want at least about 64 bits of entropy, and you should never use that password anywhere else, since they would basically run "strings" on your stuff to seed the brute-force attempt.
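For a sense of what ~64 bits means in practice (simple arithmetic, not a claim from the comment): a password of length $L$ drawn uniformly at random from an alphabet of size $A$ carries

    H = L \cdot \log_2 A \text{ bits}

so 64 bits needs roughly 13 random lowercase-plus-digit characters ($\log_2 36 \approx 5.17$), 10 random printable-ASCII characters ($\log_2 95 \approx 6.57$), or 5 Diceware words ($\log_2 7776 \approx 12.9$). A password a human makes up is almost never close to this.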
Worse than that, most phones use secure-enclave-like chips protected by a 4-digit PIN, which can be glitched via voltage manipulation to try every combo without a wipe.
> ---- All above is pure fantasy and never happened, as you probably have already guessed.
Ah, while I was a bit suspicious, I thought it might be real (it's weirdly worded). What exactly is the point of fabricating this? Is there a joke I'm blind to?
No joke; it's just that I don't like to leave any trail about legal issues, even if it's hardly a menace. That last sentence is for law enforcement, in the really hard-to-imagine case it might someday be relevant.
Every time something like this happens I assume it is a covert marketing campaign.
If the government wants to get in, they're going to get in. They can also hold you in contempt until you let them in.
Don't get me wrong, it's a good thing that law enforcement can't easily access this on their own. It just feels like the government is working with Apple here to help move some phones.
Better to be held in contempt than to give up constitutional rights under pressure - most functioning democracies have and defend the right to a free press, protection of press sources, and protection against self-incrimination.
Anyway, it's a good thing to be skeptical about claims that iphones can't be hacked by government agencies, as long as it doesn't mean you're driven to dodgier parties (as those are guaranteed honeypots).
"Government propaganda to help one of the richest companies in the history of the world sell 0.000000001% more phones this quarter" is quite frankly just idiotic.
You only said half the sentence anyway. The full sentence is: "If the government wants to get in they're going to get in, unless they want to utilize the courts in any way, in which case they have to do things the right way."
If this reporter was a terrorist in Yemen they would have just hacked her phone and/or blown up her apartment. Or even if they simply wanted to knock off her source they probably could have hacked it or gotten the information in some other illicit fashion. But that's not what is happening here.
https://archive.is/1ILVS
Remember...they can make you use touch id...they can't make you give them your password.
https://x.com/runasand/status/2017659019251343763?s=20
Link which doesn't directly support website owned by unscrupulous trillionaire: https://xcancel.com/runasand/status/2017659019251343763?s=20
Good reminder to also set up something that does this automatically for you:
https://news.ycombinator.com/item?id=46526010
There are trillionaires?
They can hold you in contempt for 18 months for not giving your password, https://arstechnica.com/tech-policy/2020/02/man-who-refused-....
That's a very unusual and narrow exception involving "foregone conclusion doctrine", an important fact missed by Ars Technica but elaborated on by AP: https://apnews.com/general-news-49da3a1e71f74e1c98012611aedc...
I previously commented a solution to another problem, but it assists here too:
https://news.ycombinator.com/item?id=44746992
Remember that our rights aren't laws of nature. They have to be fought for to be respected by the government.
> they can't make you give them your password.
Except when they can: https://harvardlawreview.org/print/vol-134/state-v-andrews/
75 footnotes for 89 sentences, nice! I guess that's how they roll over at the HLR.
I don't get why I can be forced to use my biometrics to unlock but I cannot be forced to give a pin. Doesn't jive in my brain.
The fifth amendment gives you the right to be silent, but they didn't write in anything about biometrics.
"technicality" or "loophole" is probably the word.
I fully agree, forced biometrics is bullshit.
I say the same about forced blood removal for BAC testing. They can get a warrant for your blood, that's crazy to me.
Remember, this isn't how it works in every country.
Reminder that you can press the iPhone power button five times to require passcode for the next unlock.
Alternately, hold the power button and either volume button together for a few seconds.
As far as I know lockdown mode and BFU prevent touch ID unlocking.
At least with a password or PIN, it's you who chooses to give it over.
And threats aren't illegal. They can put a gun to wife's head and say they're going to shoot. It's up to you then to call their bluff.
"Allowed to require" - a very mildly worded phrase, which could include torture or abuse of force...
https://xkcd.com/538/
Is there a way to set up a Mac to disable Touch ID if the linked phone goes into lockdown or Face ID starts requiring the passcode? Apple could probably add that.
The profiles language may be confusing -- what you can't do is change them while in Lockdown mode.
Family albums work with lockdown mode. You can also disable web restrictions per app and website.
Agreed. If I know my threat model, I don’t need unnecessary restrictions.
I'll bite. Why is it so terrible? I'm browsing this site right now on my phone and don't see the horror.
I think that ship has sailed.
> I'm sad that a lot more people don't know that Signal for Desktop is much, much less secure against adversaries with your laptop
Educate us. What makes it less secure?
If people don't have Signal set to delete sensitive messages quickly, then they may as well just be texting.
Not if she’s smart.
Did she have Bitlocker or FileVault or other disk encryption that was breeched? (Or they took the system booted as TLAs seek to do?)
There was a story here the other day: BitLocker keys stored in your Microsoft account will be handed over.
BitLocker isn't secure, for several reasons that I won't get into here.
breached
It's relatively well known that the NSO Group / Pegasus is what governments use to access locked phones.
Yes
Curious.
Probably enabled it at some point and forgot. Perhaps even during setup when the computer was new.
My recollection is that the computers do, by default, ask the user to set up biometrics.
> faulty fingerprint sensor
Very much so, because the question is... did she set it up in the past?
How did it know the print even?
Why is this curious?
There appear to be a relatively few possibilities.
* The reporter lied.
* The reporter forgot.
* Apple devices share fingerprint matching details and another device had her details (this is supposed to be impossible, and I have no reason to believe it isn't).
* The government hacked the computer such that it would unlock this way (probably impossible as well).
* The fingerprint security is much worse than years of evidence suggests.
Mainly it was buried at the very end of the article, and I thought it worth mentioning here in case people missed it.
4 replies →
Don't be idiots. The FBI may claim they can't get in whether or not they actually can:
1. If they can get in, now people - including high-value targets like journalists - will use bad security.
2. If the FBI (or another agency) has an unknown capability, the FBI must say they can't get in or reveal their capabilities to all adversaries, including to even higher-profile targets such as counter-intelligence targets. Saying nothing also risks revealing the capability.
3. Similarly if Apple helped them, Apple might insist that is not revealed. The same applies to any third party with the capability. (Also, less significantly, saying they can't get in puts more pressure on Apple and on creating backdoors, even if HN readers will see it the other way.)
Also, the target might think they are safe, which could be a tactical advantage. It also may exclude recovered data from rules of handling evidence, even if it's unusable in court. And at best they haven't got in yet - there may be an exploit to this OS version someday, and the FBI can try again then.
I would not recommend that one trust a secure enclave with full disk encryption (FDE). This is what you are doing when your password/PIN/fingerprint can't contain sufficient entropy to derive a secure encryption key.
The problem with low-entropy security measures is that the low-entropy secret is only used to instruct the secure enclave (TEE) to release/use the actual high-entropy key. So the key must be stored physically (e.g. as voltage levels) somewhere in the device.
It's a similar story when the device is locked, on most computers the RAM isn't even encrypted so a locked computer is no major obstacle to an adversary. On devices where RAM is encrypted the encryption key is also stored somewhere - if only while the device is powered on.
RAM encryption doesn’t prevent DMA attacks, and performing a DMA attack is quite trivial as long as the machine is running. Secure enclaves do prevent those and they're a good solution. If implemented correctly, they have no downsides. I'm not referring to TPMs due to their inherent flaws; I’m talking about SoC crypto engines like those found in Apple’s M series or Intel's latest Panther Lake lineup. They prevent DMA attacks and side-channel vulnerabilities. True, I wouldn’t trust any secure enclave never to be breached – that’s an impossible promise to make, even though it would require a nation-state-level attack – but even this concern can be easily addressed by making the final encryption key depend on both software key derivation and the secret stored within the enclave.
I recommend reading the AES-XTS spec, in particular the “tweak”. Or for AES-GCM look at how IV works.
I also recommend looking up PUFs and how modern systems use them in conjunction with user-provided secrets to derive keys - a password or fingerprint is one of many inputs into a KDF that produces the final keys.
The high-level idea is that the key used for encryption is derived from a well-randomized and protected device-unique secret set up at manufacturing time. Your password/fingerprint/whatever just adds a little extra entropy to that already cryptographically sound seed.
Tl;dr this is a well solved problem on modern security designs.
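To make that concrete, here is a minimal Python sketch of the idea (not any vendor's actual design; DEVICE_SECRET, the salt, and the iteration count are invented for illustration): the low-entropy user secret is stretched and then mixed with a high-entropy device-unique secret, so the final key never depends on the PIN alone.

    # Toy sketch: stretch the user secret, then mix it with the
    # device-unique secret held by the enclave/PUF.
    import hashlib, hmac, os

    DEVICE_SECRET = os.urandom(32)   # stand-in for the enclave/PUF-protected value
    user_pin = "482916"              # low-entropy user input

    # Slow KDF so offline guessing of the PIN alone is expensive.
    stretched = hashlib.pbkdf2_hmac("sha256", user_pin.encode(),
                                    b"per-device-salt", 600_000)

    # Final key depends on both inputs; the PIN by itself is useless.
    volume_key = hmac.new(DEVICE_SECRET, stretched, hashlib.sha256).digest()
    print(volume_key.hex())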
4 replies →
It seems unfortunate that enhanced protection against physically attached devices requires enabling a mode that is much broader, and sounds like it has a noticeable impact on device functionality.
I never attach my iPhone to anything that's not a power source. I would totally enable an "enhanced protection for external accessories" mode. But I'm not going to enable a general "Lockdown mode" that Apple tells me means my "device won’t function like it typically does"
There is a setting as of iOS 26 under "Privacy & Security > Wired Accessories" in which you can make data connections always prompt for access. Not that there haven't been bypasses for this before, but perhaps still of interest to you.
GrapheneOS does this by default - only power delivery when locked. Also it's a hardware block, not software. Seems to be completely immune to these USB exploit tools.
It also has various options to adjust the behaviour, from no blocks at all, to not even being able to charge the phone (or use the phone to charge something else) -- even when unlocked. Changing the mode of operation requires the device PIN, just as changing the device PIN does.
Note that it behaves subtly differently to how you described in case it was connected to something before being locked. In that case data access will remain -- even though the phone is now locked -- until the device is disconnected.
Computer security is generally inversely proportional to convenience. Best opsec is generally to have multiple devices.
> I would totally enable an "enhanced protection for external accessories" mode.
Anyone has been able to do this for over a decade now, and it's fairly straightforward:
- 2014: https://www.zdziarski.com/blog/?p=2589
- recent: https://reincubate.com/support/how-to/pair-lock-supervise-ip...
This goes beyond the "wired accessories" toggle.
It isn’t. Settings > Privacy & Security > Wired Accessories
Set to ask for new accessories or always ask.
I have to warn you, it does get annoying when you plug in your power-only cable and it still nags you with the question. But it does work as intended!
1 reply →
> it has a noticeable impact on device functionality.
The lack of optional granularity on security settings is super frustrating because it leads to many users just opting out of any heightened security.
> I never attach my iPhone to anything that's not a power source.
It's "attached" to the wifi and to the cell network. Pretty much the same thing.
Previously, direct link to the court doc:
FBI unable to extract data from iPhone 13 in Lockdown Mode in high profile case [pdf]
https://news.ycombinator.com/item?id=46843967
Can a hacked phone (such as one that was not in Lockdown Mode at one point in time) persist in a hacked state?
Obviously, the theoretical answer is yes, given an advanced-enough exploit. But let's say Apple is unaware of a specific rootkit. If each OS update is a wave, is the installed exploit more like a rowboat or a frigate? Will it likely be defeated accidentally by minor OS changes, or is it likely to endure?
This answer is actionable. If exploits are rowboats, installing developer OS betas might be security-enhancing: the exploit might break before the exploiters have a chance to update it.
Forget OS updates. The biggest obstacle to exploit persistence: a good old hard system reboot.
Modern iOS has an incredibly tight secure chain-of-trust bootloader. If you shut your device down to a known-off state (using the hardware key sequence), then on power-on you can be 99.999% certain only Apple-signed code will run all the way from SecureROM to iOS userland. The exception is if the SecureROM is somehow compromised and exploited remotely (this requires hardware access at boot time, so I don't buy it).
So, on a fresh boot, you are almost definitely running authentic Apple code. The easiest path to a form of persistence is reusing whatever vector initially pwned you (malicious attachment, website, etc) and being clever in placing it somewhere iOS will attempt to read it again on boot (and so automatically get pwned again).
But honestly, exploiting modern iOS is already difficult enough (exploits go for tens of millions of USD); persistence is an order of magnitude harder.
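As a toy illustration of the chain-of-trust idea (a hash pinned in "ROM" stands in for real signature verification, and the image names are invented): each stage refuses to hand off unless the next stage measures as expected.

    import hashlib

    def measure(blob: bytes) -> str:
        return hashlib.sha256(blob).hexdigest()

    # Burned-in value: the only thing trusted a priori.
    TRUSTED_NEXT_STAGE = measure(b"bootloader-v1")

    def boot(next_stage_image: bytes) -> None:
        if measure(next_stage_image) != TRUSTED_NEXT_STAGE:
            raise RuntimeError("refusing to boot: image does not match trusted measurement")
        print("handing off to verified stage")

    boot(b"bootloader-v1")      # ok
    # boot(b"tampered-image")   # would raise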
It's why I keep my old iPhone XR on 15.x for jailbreaking reasons. I purchased a new phone specifically for the later versions and online banking.
Apple bought out all the jailbreakers, as Denuvo did for the game crackers.
2 replies →
Secure boot and verified system partition is supposed to help with that. It's for the same reason jailbreaks don't persist across reboots these days.
Re: reboots – TFA states that recent iPhones reboot every 3 days when inactive for the same reasons. Of course, now that we know that it's linked to inactivity, black hatters will know how to avoid it...
You should read up on iOS internals before commenting stuff like this. Your answer is wrong; rootkits have been dead on most OSes for years, but especially iOS. Not every OS is like Linux, where security comes second.
Even a cursory glance, with even a basic understanding, would show it's practically impossible on iOS.
In China, there is only one way to deal with this situation: when the police summon you for the first time, do not bring your phone. Before the second summons, get a new phone or completely format your old one. However, this does not apply in cases of ongoing crimes or when someone is already wanted by the authorities, as they will not be given a second chance.
Depending on your jurisdiction faceid is safer than fingerprint, because faceid won’t unlock while your eyes are closed.
In many European countries forcing your finger on a scanner would be permissible under certain circumstances, forcing your eyes open so far has been deemed unacceptable.
Good to know. Are you sure about this, though? I swear I've seen people use Face ID on someone who's sleeping.
The flag is called: “Require Attention for Face ID”
100% sure about the legal situation in Germany.
"Lockdown Mode is a sometimes overlooked feature of Apple devices that broadly make[sic] them harder to hack."
Funny to see disabling "features" itself described as a "feature".
Why not call it a "setting"?
Most iPhone users do not change default settings. That's why Google pays Apple billions of dollars for a default setting that sends data about users to Google
"Lockdown Mode" is not a default setting
The phrase "sometimes overlooked" is an understatement. It's not a default setting and almost no one uses it
If it is true Lockdown Mode makes iPhones "harder to hack", as the journalist contends, then it is also true that Apple's default settings make iPhones "easier to hack"
The intention behind Lockdown Mode is protection for a select few groups of people, such as journalists, who are at risk of having software like Pegasus used against them. It’s to reduce the attack surface. The average user wouldn’t want most of it as a default setting; for example, almost no message attachments allowed, no FaceTime calls from people you haven’t called, and Safari is kneecapped. Making this a default setting for most people is unrealistic and probably wouldn’t help their cybersecurity anyway, as they wouldn’t be targeted.
A "reduced attack surface" can also be a reduced surface for telemetry, data collection, surveillance and advertising services, thereby directly or indirectly causing a reduction in Apple revenues
Perhaps this could be a factor in why it's not a default setting
Can anyone speak to the relative safety or lack thereof using FaceID on individual apps while requiring a PIN to login to the device?
I have my phone set up this way because Face ID can be so convenient. I know it opens up more attack vectors than not using it, but is it possible for a powerful actor to exploit the fact that it is enabled at all to gain access to a locked phone?
It sounds like almost all of our devices have security by annoyance as default. Where are the promises of E2E encryption and all the privacy measures? When I turned on lockdown mode on my iPhone, there were a few notifications where the random spam calls I get were attempting a FaceTime exploit. How come we have to wait until someone can prove ICE can't get into our devices?
I trust 404 media more than most sources, but I can’t help but reflexively read every story prominently showcasing the FBI’s supposed surveillance gaps as attempted watering hole attacks. The NSA almost certainly has hardware backdoors in Apple silicon, as disclosed a couple of years ago by the excellent researchers at Kaspersky. That being the case, Lockdown Mode is not even in play.
The NSA is not going to tip its hand about any backdoors it had built into the hardware for something as small as this.
It depends on whether parallel construction can be used to provide deniability.
3 replies →
My Google pixel 5a randomly requires the pin/password every couple of days and will not accept biometrics. I have always assumed this was to heavily discourage using long passwords for this very reason.
Can't they just use Pegasus or Cellebrite???
It's unlikely that Pegasus would work since Apple patched the exploit it used.
I think it's unclear whether Cellebrite can or cannot get around Lockdown Mode as it would depend very heavily on whether the technique(s)/exploit(s) Cellebrite uses are suitable for whatever bugs/vulnerabilities remain exposed in Lockdown Mode.
Samsung phones have the Secure Folder which can have a different, more secure password and be encrypted when the phone is on.
Secure Folder uses, or is starting to use, Android's native Private Space feature, which is available on all Android 15 phones.
I use the Cryptomator app for this, it works as advertised. I keep ~60 GiB of personal files in there that would be an easy button to steal my identity and savings. I'm just hoping it doesn't include an NSA back door.
You can check the GitHub: https://github.com/cryptomator/ios
3 replies →
The NSA definitely has easier ways to steal your identity and savings if they wanted to anyways
We need a Lockdown mode for MacBooks as well!
Looks like it’s a feature: https://support.apple.com/en-us/105120
To save a click:
* Lockdown Mode needs to be turned on separately for your iPhone, iPad, and Mac.
* When you turn on Lockdown Mode for your iPhone, it's automatically turned on for your paired Apple Watch.
* When you turn on Lockdown Mode for one of your devices, you get prompts to turn it on for your other supported Apple devices.
What is she investigated for?
They're not actually investigating her; they're investigating a source that leaked classified materials to her.
If they're not investigating her she doesn't have any 5th-amendment protection and can be compelled to testify on anything relevant, including how to unlock her devices.
1 reply →
Did the individual store the classified material in the bathroom at his beach-side resort?
Every time I see these articles about iphones posing trouble for authorities, I always think of it as free (and fraudulent) advertisement.
I could be naive, but just don't think they'd really have any difficulty getting what they needed. Not that I give a fuck, but I guess I've seen one too many free ads.
A little too late for the 1,000 people hacked by Pegasus.
I guess they got a 404
Given Cook's willing displays of fealty to Trump this time around I wouldn't be shocked if they were to remove lockdown mode in a future release.
For now! They’ll get something from the open market, like last time, when Apple refused to decrypt (or unlock?) a phone for them.
Yeah, this is low-stakes stuff; Pegasus has historically broken into Apple phones easily. Bezos's nudes and Khashoggi know. (Not really; Khashoggi is dead.)
[flagged]
Both of your comments here, posted just one minute apart yet with completely different content, reek of LLM output.
Thanks and please see https://news.ycombinator.com/item?id=46888857.
People probably didn't see the other post, but both posts are several paragraphs and posted the same minute. No human would do that.
It's also a new account that has only posted these two comments.
6 replies →
So what, if the content is good?
Also, some of us draft our comments offline, and then paste them in. Maybe he drafted two comments?
Posting sibling comments is unusual.
2 replies →
You still go to prison for not showing. So until devices have multiple PINs for plausible deniability, we are still screwed.
What’s so hard about making 2-3 PINs, each giving access to different logged-in apps and files?
If Apple/Android were serious about it they would implement it, but from my research it seems someone is against it, because it’s too good.
I don’t want to remove my banking apps when I travel or go to “dangerous” places. If you’re kidnapped, you will be forced to send out all your money.
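For what it's worth, here is a toy Python sketch of the basic idea (no shipping OS works like this; the PINs and profile names are invented): each enrolled PIN maps to a different profile, so a duress PIN could open a decoy data set instead of simply failing. The hard part, as the replies point out, is hiding that the other profiles exist at all.

    import hashlib, hmac, os

    SALT = os.urandom(16)

    def enroll(pin: str) -> bytes:
        # Slow hash so offline guessing is expensive.
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), SALT, 600_000)

    profiles = {
        enroll("839201"): "primary",   # full data set
        enroll("111111"): "decoy",     # travel profile without banking apps
    }

    def unlock(pin: str):
        candidate = enroll(pin)
        for digest, profile in profiles.items():
            if hmac.compare_digest(candidate, digest):
                return profile
        return None

    print(unlock("111111"))   # -> "decoy"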
Absolutely every aspect of it?
What’s so hard about adding a feature that effectively makes a single-user device multi-user? Which needs the ability to have plausible deniability for the existence of those other users? Which means that significant amounts of otherwise usable space need to be inaccessibly set aside for those other users on every device—to retain plausible deniability—despite an insignificant fraction of customers using such a feature?
What could be hard about that?
38 replies →
> You still go to prison for not showing. So until devices have multiple PINs for plausible deniability, we are still screwed.
> What’s so hard about making 2-3 PINs, each giving access to different logged-in apps and files?
Besides the technical challenges, I think there's a pretty killer human challenge: it's going to be really hard for the user to create an alternate account that looks real to someone who's paying attention. Sure, you can probably fool some bored agent in customs line who knows nothing about you, but not a trained investigator who's focused on you and knows a lot about you.
7 replies →
It's more a policy problem than a phone problem. Apple could add as many pins as they want but until there are proper legal based privacy protections, law enforcement will still just be like "well how do we know you don't have a secret pin that unlocks 40TB of illegal content? Better disappear you just to be sure"
For as long as law enforcement treats protection of privacy as implicit guilt, the best a phone can really do is lock down and hope for the best.
Even if there was a phone that existed that perfectly protected your privacy and was impossible to crack or was easy to spoof content on, law enforcement would just move the goal post of guilt so that owning the phone itself is incriminating.
Edit: I wanna be clear that I'm not saying any phone based privacy protections are a waste of time. They're important. I'm saying that there is no perfect solution with the existing policy being enforced, which is "guilty until proven dead"
Hannah Natanson is not in prison though.
How does "go to prison for not showing" work when a lot of constitutions have a clause for a suspect not needing to participate in their own conviction / right to remain silent?
A detective can have a warrant to search someone's home or car, but that doesn't mean the owner needs to give them the key as far as I know.
7 replies →
Assuming the rule of law is still functioning, there are multiple protections for journalists who refuse to divulge passwords in the USA. A journalist can challenge any such order in court and usually won't be detained during the process as long as they show up in court when required and haven't tried to destroy evidence.
Deceiving investigators by using an alternate password, or destroying evidence by using a duress code on the other hand is almost always a felony. It's a very bad idea for a journalist to do that, as long as the rule of law is intact.
1 reply →
They are willing to kill people and then justify it by calling them terrorists. Plausible deniability is pointless.
2 replies →
Fourth and Fifth amendments disagree
5 replies →
There is no plausible deniability here; that's only relevant in a rule-of-law type of situation, but then you wouldn't need it, since you can't legally be compelled to unlock anyway. "We don't see any secret source communication on your work device = you entered the wrong PIN = go think about your behavior in jail."
Even if this worked (which would be massively expensive to implement) the misconfiguration possibilities are endless. It wouldn't be customer-centric to actually release this capability.
Better for the foreseeable future to have separate devices and separate accounts (i.e. not in the same iCloud family for instance)
Completely separate decision with a higher legal bar for doing that.
It's one thing to allow police to search a phone. Another to compel someone to unlock the device.
We live in a world of grays and nuance and an "all or nothing" outlook on security discourages people from taking meaningful steps to protect themselves.
“Plausible deniability” is a public relations concept. It doesn’t confer any actual legal protection.
3 replies →
Yep, you need an emergency mode that completely resets the phone to factory settings, maybe triggered with a decoy pin. Or a mode that physically destroys the chip storing the keys
I always wondered if this was the feature of TrueCrypt that made it such a big target. LUKS is fine, I guess, but TrueCrypt felt like actual secrecy.
You do not. We have this thing in our constitution called the 5th amendment. You cannot be forced to divulge the contents of your mind, including your pin or passwords. Case law supports this. For US citizens at least. Hopefully the constitution is still worth something.
15 replies →
Why are you on a website for programmers and software developers if you arent a software developer and you know nothing of the subject?
1 reply →
> What’s so hard about making 2-3 PINs, each giving access to different logged-in apps and files?
I've been advocating for this under-duress-PIN feature for years, as evidenced by this HN comment I made about 9 years ago: https://news.ycombinator.com/item?id=13631653
Maybe someday.
Serious question: What are the "valid concerns" about people securing their computing devices against third parties?
This (I think) refers not to the people securing their devices against third parties but the vendors "securing" the devices against loss of profits.
Essentially, the question referenced here is that of ownership. Is it your device, or did you rent it from Apple/Samsung/etc. If it is locked down so that you can't do anything you want with it, then you might not actually be its owner.
___
_Ideally_ you wouldn't need to trust Apple as a corp to do the right thing. Of course, as this example shows, they seem to actually have done one right thing, but you do not know if they will always do.
That's why a lot of people believe that the idea of such tight vendor control is fundamentally flawed, even though in this specific instance it yielded positive results.
For completeness: no, I do not know either how this could be implemented differently.
8 replies →
One valid concern about "locked down computing" is the potential for 3rd parties to secure computing devices against their owners.
In this case I think "valid concerns about locked down computing" refers to the owner's use of the phone being restricted: they can't download applications they want to use, they don't have unrestricted access to the filesystem, they are forced to pay an Apple commission to engage in certain forms of commerce, etc. These may be acceptable tradeoffs, but they're valid concerns nonetheless.
I don't have to have any concern to be able to secure my device against third parties, it's just good operational discipline.
I don't do anything classified, or store anything I don't want found. On the other hand, I equally don't want anyone to be able to get at and fiddle with a device which is central to my life.
That's all.
It's not "I have nothing to hide" (which I don't actually have), but I don't want to put everything in the open.
Security is not something we should have to earn; we should have it at the highest level by default.
Corrupt government officials gunning down inconvenient people.
8 replies →
Lockdown Mode significantly affects the usability of the phone.
It completely disables JIT JavaScript compilation in Safari, for example.
9 replies →
Pegasus.
Jedi.
SKyWIper.
Rogue Actors.
Rogue thieves.
Rogue governments.
Your spouse.
Separating corporate IT from personal IT.
There are plenty of reasons.
2 replies →
Oh, come on. Don't look at another man's Portal Gun history. We all go to weird places.
I get so annoyed by this Socratic line of questioning because it’s extremely obvious.
Terrorist has plans and contacts on laptop/phone. Society has a very reasonable interest in that information.
But of course there is the rational counterargument of “the government designates who is a terrorist”, and the Trump admin has gleefully flouted norms around that designation, endangering the rule of law.
So all of us are adults here and we understand this is complicated. People have a vested interest in privacy protections. Society and government often have reasonable interest in going after bad guys.
Mediating this clear tension is what makes this so hard, and silly lines of questioning like this try to pretend it’s simple.
19 replies →
Think of the children
1 reply →
Some platforms will side-load anything the telecom carrier sends.
It is naive to assume iOS can be trusted much more than Android. =3
2 replies →
> It's a real world example of how these security features aren't just for "paranoid people" but serve a legit purpose for people who handle sensitive info.
Because they're in the US, things might be easier from a legal standpoint for the journalist, but there is also precedent for forcing journalists to expose their sources: https://en.wikipedia.org/wiki/Branzburg_v._Hayes
In other parts of the world, this applies (https://xkcd.com/538/) when you don't give the authorities the means to access your phone.
It just depends on how much a government wants the data that is stored there.
Which countries actually grant reporters immunity from having to reveal information related to criminal investigations (where others would be compelled to, and without criminal penalties)? Such immunity may be desirable (at least in some circumstances), but I am not aware of any jurisdiction that actually grants it.
2 replies →
Indeed, likely as secure as the VPNs run by intelligence contractors.
1. iOS has well-known poorly documented zero-click exploits
2. Firms are required to retain your activity logs for 3 months
3. It is illegal for a firm to deny or disclose sealed warrants on US soil, and it is up to one judge whether to rummage through your trash. If I recall correctly, only around 8 out of 18,000 searches were rejected.
It only costs about $23 to MITM someone's phone now, and it is not always domestic agencies pulling that off. =3
> 1. iOS has well-known poorly documented zero-click exploits
PoC || GTFO, to use the vernacular.
If you're talking about historical bugs, don't forget the update adoption curves.
2 replies →
With the US descending more and more into fascism (as this case highlights yet again), I wonder what will happen to these features in the future. Especially now that the tech moguls of silicon valley stopped standing up to Trump and instead started kissing his ass. Tim Cook in particular seems to be the kind of person that rather is on the rich side of history than the right side. What if the administration realizes they can easily make Apple et al. give up their users by threatening their profits with tariffs and taxes?
How is it turning into fascism?
5 replies →
Apple seems to strongly discourage the use of lockdown mode. Presumably it is in conflict with their concern over share price and quarterly earnings.
How do they discourage it? It’s a clearly-labeled button in the Settings app, which brings up one modal sheet explaining what will change if you turn it on, then one more button press and it’s on.
Citation needed?
Apple does a lot of things I don't agree with in the interest of share price (like cozying up to authoritarian governments) but this seems like a reach to criticize them for a feature they have put extensive effort into, rather than applauding that they resist spying and enhance customer privacy. Sure, it's an optional feature and maybe they don't push broad acceptance of it, but it's important for those that need it.
1 reply →
Didn’t they make it?
1 reply →
[flagged]
`hnrayst` seems to be another AI (?) bot account created in 2022 with only two comments, both being in this very thread we're in today:
https://news.ycombinator.com/threads?id=hnrayst
Something weird is going on at Hacker News recently. I've been noticing these more and more.
[dead]
Takeaway is to not enable biometric unlock if you are concerned about your data being accessed by authorities.
The trick is not to use your right index finger as a biometric unlock finger (the button sits on the top right corner of the keyboard). If you are "forced" to unlock, the agents will guide your fingers and probably try that first 2-3 times. Two more tries, and fingerprint reading gets disabled. Quite good odds.
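Back-of-the-envelope math for those odds: a rough sketch assuming the agents guess uniformly among ten fingers without replacement, which overstates their difficulty since thumbs and index fingers are likelier picks.

    from fractions import Fraction

    def p_guessed(enrolled=1, fingers=10, tries=3):
        # Probability that at least one of `tries` distinct guesses
        # lands on an enrolled finger.
        miss = Fraction(1)
        remaining = fingers
        for _ in range(tries):
            miss *= Fraction(remaining - enrolled, remaining)
            remaining -= 1
        return 1 - miss

    print(p_guessed(tries=3))              # 3/10 with one enrolled finger
    print(p_guessed(enrolled=2, tries=3))  # 8/15 if two fingers are enrolled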
This has long been true. In a pinch you can mash the power button 5+ times to require a key code at next unlock.
4 replies →
So in America, they can force you to use a biometric but they can't compel you to reveal your password?
I mean, I agree with you, but it's a really weird line in the sand to draw.
18 replies →
It's interesting because the latest Cellebrite data sheets showed them to support all iPhones including e.g. unbooted, but apparently not lockdown mode? It also showed they hadn't cracked GrapheneOS.
Wait, was this an oversight on her part about the biometric unlock? My MacBook's biometric unlock gets disabled after a while and requires a password if the lid has been closed for a substantial amount of time.
Does anyone know if iOS in Lockdown Mode stops syncing mail, iMessage, call history, etc. to your other Apple devices? I am wondering if the reporter's stuff was all synced from the iPhone to the non-Lockdown MacBook.
They usually ask you to enable lockdown mode on all your devices for advanced protection, even though you can skip it if you want.
1 reply →
https://support.apple.com/en-us/105120
Looks like lockdown mode is focused on blocking inbound threats, not the sharing of data from the device.
I can't imagine it would. The accounts don't flow through the phone; you're just logged in to them on both devices.
> (forced her finger on Touch ID per the warrant)
Can anyone link a source for this? I’ve been seeing conflicting claims about this part.
https://news.ycombinator.com/item?id=46886694
1 reply →
> forced her finger on Touch ID per the warrant
She was not forced, and the warrant does not state that she could be forced. The warrant, almost certainly deliberately, uses far milder language.
The warrant is the force; current jurisprudence largely says warrants do compel people to provide biometric unlocks because it's not speech the way giving up a password/passcode would be. Blocking or not complying with a signed warrant from a judge is its own crime, and the only safe way to fight one is with a lawyer in court, not with the officer holding the paper (and the gun/taser/etc. with the power of the state behind them).
What do you think warrants are? You think they get a warrant and they say, "Can you put your finger on the device?" You say, "No," and that's it? If all they wanted to do was ask you, they would just ask you without the warrant.
6 replies →
By definition a warrant is force backed by state violence
You’re saying she complied willingly?
9 replies →
[flagged]
Do you disagree with the facts of the article? Or is it propaganda simply because the facts don't support your narrative and ideological inclinations?
Selective amplification of true events, as well as selective reporting, are the bread and butter of modern propaganda. It works a lot better than telling outright falsehoods, which - in the long term - cause people to lose faith in everything you have to say. And there's always someone jumping to your defense - after all, you did not outright lie...
2 replies →
Man, people are whiny about this on Hacker News when they should know better. There is no real computer security without hardware roots of trust and keystores.
[flagged]
> full-drive encryption
Note that these are uncrackable only if you have a strong password (a random one will work). Unlike on phones, there is nothing slowing down brute-force attempts, only the comparatively much weaker PBKDFs if you use a password. You want at least about 64 bits of entropy, and you should never use that password anywhere else, since they would basically run "strings" on your stuff to attempt the brute force.
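Rough arithmetic behind the "at least about 64 bits" figure: a sketch assuming a made-up offline guess rate of 1e9 per second (real rates depend entirely on the KDF and the attacker's hardware).

    def years_to_exhaust(bits: float, guesses_per_second: float = 1e9) -> float:
        # Time to try the entire keyspace at the assumed guess rate.
        return (2 ** bits) / guesses_per_second / (3600 * 24 * 365)

    for bits in (40, 64, 80):
        print(f"{bits} bits: ~{years_to_exhaust(bits):.2e} years to exhaust")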
Worse than that, most phones are using secure-enclave-like chips protected by a 4-digit PIN that can be voltage drained to try every combo without a wipe.
> ---- All above is pure fantasy and never happened, as you probably have already guessed.
Ah, while I was a bit suspicious, I thought it might be real (weirdly worded). What exactly is the point of fabricating this? Is there a joke I'm blind to?
No joke; it's just that I don't like to leave any trail about legal issues, even if it is hardly a threat. This last sentence is for law enforcement, in the really hard-to-imagine case it might be relevant sometime.
They just need to ask Apple to unlock it. And Apple can't really refuse under US law.
They can refuse, and they have refused. See San Bernardino and the concept of "compelled work".
That was the old US law, not the one where Tim Cook delivered gold bars to Trump
1 reply →
Every time something like this happens I assume it is a covert marketing campaign.
If the government wants to get in they’re going to get in. They can also hold you in contempt until you do.
Don’t get me wrong, it’s a good thing that law enforcement can't easily access this on their own. It just feels like the government is working with Apple here to help move some phones.
Better to be held in contempt than to give up constitutional rights under pressure - most functioning democracies have and defend the right to free press, protecting said press sources, and can't make you incriminate yourself.
Anyway, it's a good thing to be skeptical about claims that iphones can't be hacked by government agencies, as long as it doesn't mean you're driven to dodgier parties (as those are guaranteed honeypots).
"Government propaganda to help one of the richest companies in the history of the world sell 0.000000001% more phones this quarter" is quite frankly just idiotic.
You only said half the sentence anyway. The full sentence is: "If the government wants to get in they're going to get in, unless they want to utilize the courts in any way, in which case they have to do things the right way."
If this reporter was a terrorist in Yemen they would have just hacked her phone and/or blown up her apartment. Or even if they simply wanted to knock off her source they probably could have hacked it or gotten the information in some other illicit fashion. But that's not what is happening here.