Comment by cornholio
3 hours ago
Beyond the crypto architecture debate, I don't really understand how anyone could imagine a world where MS could just refuse such a request. How exactly would we draft laws to this effect: "the authorities can subpoena any piece of evidence, except when complying with such a request might break the contractual obligations of a third party towards the suspect"?
Do we really, really, fully understand the implications of allowing for private contracts that can trump criminal law?
They could just ask before uploading your encryption key to the cloud. Instead, they force people to use a Microsoft Account to set up their Windows and store the key without explicit consent.
That's a crypto architecture design choice: MS opted for the user-friendly key-escrow option instead of the more secure strong local key, which requires a competent user who sets a strong password, saves recovery codes, understands the disastrous implications of key loss, and so on.
Given the abilities of the median MS client, the better choice is not obvious at all, while "protecting from a nation-state adversary" was definitely not one of the goals.
While you're right, they also went out of their way to prevent competent users from using local accounts and/or not upload their BitLocker keys.
I could understand if the default were an online account + automatic key upload, but only if you add an opt-out. It wouldn't even need to be visible by default; hide it somewhere so you can be sure the median MS user won't see it or think about it. But flatly refusing to let your users decide against uploading the encryption key to your servers is evil, straight up.
Yes, and they had to lie to sell that option.
If they honestly informed customers about the trade-off between security and convenience, they'd certainly have far fewer customers. Instead, they lead people to believe they can get that convenience for free.
The obvious better choice is transparency.
Protecting from a nation state adversary should probably be a goal for the kind of enterprise software MS sells.
Protecting from specifically the nation state that hosts and regulates Microsoft and its biggest clients, probably not.
This is a consent issue, and visibility thereof, not "crypto architecture"
It makes sense if you consider the possibility of a secret deal between the government and a giant corporation. The deal is that people's data is never secure.
It's a nightmare actually.
The alternative is just not having FDE on by default, it really isn't "require utterly clueless non-technical users to go through complicated opt-in procedure for backups to avoid losing all their data when they forget their password".
And AFAICT, they do ask, even if the flow is clearly designed to get the user to back up their keys online.
> The alternative is just not having FDE on by default
Yes, it would be. So, the current way, 99% of people benefit from knowing their data is secure when very common thefts occur, and 1% of people get the same outcome as if their disk were unencrypted: when they're arrested and their computers are seized, the cops have their crime secrets. What's wrong with that?
No, encryption keys should never be uploaded to someone else's computer unencrypted. The OOBE should give users a choice: no FDE; FDE with a warning that they must not forget their password; or FDE where Microsoft has their key, can recover their disk, and could be compelled to share the key with law enforcement. By presenting the three options with their consequences, you empower users to address their threat model as they see fit. There is no good default choice here; the trade-offs are too varied.
Forcing implies there are zero ways to begin with a local-only account (or another non-Microsoft account). That's simply not true.
Disagree. If the path is shrouded behind key presses and commands unpublished by MS (and in some instances routes that have since been closed), it may as well be.
> How exactly would we draft laws to this effect: "the authorities can subpoena any piece of evidence, except when complying with such a request might break the contractual obligations of a third party towards the suspect"?
Perhaps in this case they should be required to get a warrant rather than a subpoena?
Encrypt the BitLocker key with the user's password? There are a lot of technical solutions besides "we're going to keep the BitLocker keys in the clear and readily available to anyone".
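To make the suggestion concrete, here's a minimal stdlib-only sketch of password-wrapped escrow. The XOR "wrap" stands in for a real authenticated key-wrap cipher (e.g. AES-KW), and every name here is illustrative, not Microsoft's actual scheme:

```python
import os
import hashlib

def wrap_disk_key(disk_key: bytes, password: str, salt: bytes) -> bytes:
    # Derive a wrapping key from the user's password (scrypt is in the stdlib).
    kek = hashlib.scrypt(password.encode(), salt=salt,
                         n=2**14, r=8, p=1, dklen=len(disk_key))
    # XOR stands in for a real authenticated key-wrap in this sketch.
    return bytes(a ^ b for a, b in zip(disk_key, kek))

def unwrap_disk_key(wrapped: bytes, password: str, salt: bytes) -> bytes:
    kek = hashlib.scrypt(password.encode(), salt=salt,
                         n=2**14, r=8, p=1, dklen=len(wrapped))
    return bytes(a ^ b for a, b in zip(wrapped, kek))

salt = os.urandom(16)
disk_key = os.urandom(32)          # the actual volume encryption key
wrapped = wrap_disk_key(disk_key, "hunter2", salt)

# Only the wrapped blob would be escrowed; without the password it is useless.
assert unwrap_disk_key(wrapped, "hunter2", salt) == disk_key
assert unwrap_disk_key(wrapped, "wrong", salt) != disk_key
```

The catch, as the replies point out, is that the escrowed blob is now only as recoverable as the user's memory of the password.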
For something as widely adopted as Windows, the only sensible alternative is to not encrypt the disk by default.
The default behavior will never, ever be "encrypt the disk with a key and encrypt the key with the user's password". It just doesn't work in real life: you'd have thousands of users losing access to their disks every week.
While this is true, why even bother turning on encryption and making things harder for disk data recovery services in that case?
Inform, and empower with real choices. Make it easy for end users to select an alternate key-backup method. Some potential alternatives: allow their bank to offer such a service; allow friends and family to self-host one; etc.
This is a bit tricky, as it couples the user's password to the disk encryption key. If a user changes the password, they would then need to change the encryption key, or keep the previous (possibly compromised) password around. A better option is to have the user record a long random recovery key, but that's never going to be friendly to the average computer user.
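For what it's worth, the usual way around that coupling (used by LUKS, and by BitLocker internally) is a two-level scheme: a random data-encryption key (DEK) encrypts the disk, and the password only wraps the DEK, so a password change re-wraps the same DEK without re-encrypting anything on disk. A rough stdlib-only sketch, with XOR standing in for a real cipher and all names invented for illustration:

```python
import os
import hashlib

def kek_from(password: str, salt: bytes, size: int) -> bytes:
    # Stretch the password into a key-encryption key (KEK).
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000, size)

def xor(a: bytes, b: bytes) -> bytes:
    # Toy stand-in for a real key-wrap cipher.
    return bytes(x ^ y for x, y in zip(a, b))

dek = os.urandom(32)               # encrypts the disk; never changes
salt = os.urandom(16)
wrapped = xor(dek, kek_from("old-password", salt, 32))

# Password change: unwrap with the old password, re-wrap with the new one.
new_salt = os.urandom(16)
recovered = xor(wrapped, kek_from("old-password", salt, 32))
wrapped = xor(recovered, kek_from("new-password", new_salt, 32))
salt = new_salt

# The disk contents stay valid because the DEK itself never changed.
assert xor(wrapped, kek_from("new-password", salt, 32)) == dek
```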
Basically, we need better education about the issue, but as this is the case with almost every contentious issue in the world right now, I can't imagine this particular issue will bubble to the top of the awareness heap.
I thought this was what happened. Clearly not :( That's the idea with services like 1Password (which I suppose is ultimately doing the same thing): you need both the key held on the device and the password.
I suppose this all falls apart when the PC unlock password is your MS account password, since the MS account can reset the local password. On macOS / Linux, if you reset the login password, you lose the keychain.
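The two-secret idea mentioned above can be sketched as deriving the vault key from both a device-held secret and the password, so neither alone is enough. This is only illustrative, not 1Password's actual key derivation:

```python
import os
import hashlib

device_secret = os.urandom(32)   # hypothetical secret stored only on the enrolled device
password = "correct horse battery staple"

def vault_key(pw: str, dev_secret: bytes) -> bytes:
    # Stretch the password with the device secret mixed in as the salt:
    # an attacker needs both inputs to reproduce the key.
    return hashlib.pbkdf2_hmac("sha256", pw.encode(), dev_secret, 100_000)

key = vault_key(password, device_secret)

# A wrong password or a missing device secret yields a different key.
assert vault_key("wrong password", device_secret) != key
assert vault_key(password, os.urandom(32)) != key
```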
In case of 1password, I would think it would be challenging to do what you are saying, at least for shared password vaults.
At this point, end-to-end encryption is a solved problem when password managers exist. Not doing it means either Microsoft doesn't care enough, or it is actually interested in keeping it this way.
I wouldn't call the problem "solved" just because of password managers.
Password managers shift the paradigm and the risk factors. In terms of MFA, a password in your manager is now "something you have" rather than "something you know". The only password I know nowadays is my sign-in password that unlocks the password manager's vault. So the passwords to my bank, my health care, my video games are no longer "in my fingers" or in my head anymore, they're unknown to me!
So vault management becomes the issue rather than password management. If passwords are now "something you have" then it becomes possible to lose them. For example, if my home burns down and I show up in a public library with nothing but the clothes on my back, how do I sign into my online accounts? If the passwords were in my fingers, I could do this. But if they require my smartphone to be operational and charged and having network access, and also require passwords I don't know anymore, I'm really screwed at that library. It'd be nearly impossible for me to sign back in.
So in the days of MFA and password managers, now we need to manage the vaults, whether they're in the cloud or in local storage, and we also need to print out recovery codes on paper and store them securely somewhere physical that we can access them after a catastrophe. This is an increase in complexity.
So I contend that password managers, and their cousins the nearly-ubiquitous passkeys, are the main driving factor in people's forgetting their passwords and forgetting how to sign-in now, without relying on an app to do it for them. And that is a decrease in opsec for consumers.
Sure, that's valid; they do need to comply with legal orders. But they don't need to store BitLocker keys in the first place; they only need to turn over data they actually have.
I don't think many people here are naive enough to believe that any business would fight the government for the sake of its customers. I think most of us are simply appalled by this blatantly malicious behavior. I'm not buying all these "but what if the user is an illiterate, senile 90-year-old with ADHD, huh?" attempts to rationalize it away. It's the equivalent of the guy who installed your door keeping a copy of your keys by unspoken default: "what if your toddler locks himself out, huh?"
I know the police can just break down my door, but that doesn't mean I should be ok with some random asshole having my keys.
Assume good intent. If Microsoft didn't escrow the keys, the next HN post would be "mIcR0SofT Ate mY chILDhooD pHOTos!!"
This makes little to no sense.
This is being reported on because it seems newsworthy and a departure from the norm.
Apple also categorically says they refuse such requests.
It's a private device. With private data. Device and data owned by the owner.
Using sleight of hand and words to coax a password into a shared cloud and beyond just seems to indicate the cloud is someone else's computer, and you are putting the keys to your world and your data insecurely in someone else's computer.
Should windows users assume their computer is now a hostile and hacked device, or one that can be easily hacked and backdoored without their knowledge to their data?
The San Bernardino incident is a very different issue, where Apple refused to use its own private key to sign a tool that would have unlocked any iPhone. There is no comparison between Apple's and MS's conduct here, because the architectures of the respective systems are so different (though of course that's a choice each company made).
Should Apple find itself with a comparable decryption key in its possession, it would have little option but to comply and hand it over.
Firstly, Apple does not refuse such requests. In fact, it was very widely publicized in the past couple of weeks that Apple has removed Advanced Data Protection for users in the UK. So while US users still enjoy Advanced Data Protection from Apple, UK users do not.
It is entirely possible that Apple's Advanced Data Protection feature gets removed in the US as well, if the regime decides to target it. I suspect there are two possible reasons why it has not been: either the US has an additional agreement with Apple behind the scenes somewhere, or the US regime has not yet felt this was important enough to go after.
There is precedent in the removal: Apple has shown it will do it if asked/forced. What makes you think they wouldn't do the same thing in the US if Trump threatened to ban iPhone shipments from China until Apple complied?
The options for people to manage this stuff themselves are extremely painful for the average user for many reasons laid out in this thread. But the same goes for things like PGP keys. Managing PGP keys, uploading to key servers, using specialized mail clients, plugging in and unplugging the physical key, managing key rotation, key escrow, and key revocation. And understanding the deep logic behind it actually requires a person with technical expertise in this particular solution to guide people. It's far beyond what the average end user is ever going to do.
You seem to be forgetting the time the Obama administration asked Apple to unlock a suspect’s iPhone and Apple refused.
> don't really understand how anyone could imagine a world where MS could just refuse such a request
By simply not having the ability to do so.
Of course Microsoft should comply with the law, expecting anything else is ridiculous. But they themselves made sure that they had the ability to produce the requested information.
Right, Microsoft has the ability to recover the key, because average people lose their encryption keys and will blame Microsoft if they can't unlock their computer and regain access to their files. BitLocker protects you from someone stealing your computer to get at your files; that's it. It's no good in a corporate setting, or if you're worried about governments spying on you.
I'm honestly not entirely convinced that disk encryption should be enabled by default. How much of a problem were stolen personal laptops, really? Corporate machines, sure, but leave the master key with the IT department.
> Do we really, really, fully understand the implication of allowing private contracts that trump criminal law?
...it's not that at all. We don't want private contracts to enshrine the same imbalances of power; we want those imbalances rendered irrelevant.
We hope against hope that people who have strength, money, reputation, legal teams, etc., will be as steadfast in asserting basic rights as people who have none of those things.
We don't regard the FBI as a legitimate institution of the rule of law, but a criminal enterprise and decades-long experiment in concentration of power. The constitution does not suppose an FBI, but it does suppose that 'no warrant shall issue but upon probable cause... particularly describing the place to be searched, and the persons or things to be seized' (emphasis mine). Obviously a search of the complete digital footprint and history of a person is not 'particular' in any plain meaning of that word.
...and we just don't regard the state as having an important function in the internet age. So all of its whining and tantrums and pepper spray and prison cells are just childish clinging to a power structure that is no longer desirable.
I think legally the issue was adjudicated by analogy to a closed safe: while the exact contents of the safe are unknown beforehand, it is reasonable to expect it will contain evidence, documents, money, weapons, etc. that are relevant; so if a warrant can be issued in that case compelling a locksmith to open it, then by analogy one can be issued against an encrypted device.
No doubt this analogy breaks down as society becomes more digital. What about a Google Glass type of device that records my entire life, or the glasses of all the people detected around me? What about the device to which I uploaded my consciousness; can law enforcement simply probe around my mind and find direct evidence of my guilt? Any written constitution is just a snapshot of a social contract at a particular historical moment and technological development point, so it cannot serve as the ultimate source of truth regarding individual rights; the contract is renegotiated constantly through political means.
My question was more general: how could we draft that new social contract for the current age? How could we maintain a balance where the encrypted device of a suspected child predator and murderer is left encrypted, despite the fact that some third party has the key, because we agreed that is the correct way to balance freedom and law enforcement? It just doesn't sound stable in a democracy, where the rules of that social contract can change; it would contradict the moral intuitions of the vast majority.
> so if a warrant can be issued in that case compelling a locksmith to open it, then by analogy it can be issued against an encrypted device.
But it isn't a warrant, it's a subpoena. Also, the locksmith isn't the one compelled to open it; if the government wants someone to do that they have to pay them.
> Any written constitution is just a snapshot of a social contract at a particular historical time and technological development point, so it cannot serve as the ultimate source of truth regarding individual rights - the contract is renegotiated constantly through political means.
The Fourth Amendment was enacted in 1791. A process to change it exists, implying that the people could change it if they wanted to, but sometimes they get it pretty right to begin with. And then who are these asshats craving access to everyone's "papers and effects" without a warrant?