"You can only turn off this setting 3 times a year."
Astonishing. They clearly feel their users have no choice but to accept this onerous and ridiculous requirement. As if users wouldn't understand that they'd have to go way out of their way to write the code which enforces this outcome. All for a feature which provides me dubious benefit. I know who the people in my photographs are. Why is Microsoft so eager to also be able to know this?
Privacy legislation is clearly lacking. This type of action should bring the hammer down swiftly and soundly upon these gross and inappropriate corporate decision makers. Microsoft has needed that hammer blow for quite some time now. This should make that obvious. I guess I'll hold my breath while I see how Congress responds.
It's hilarious that they actually say that right on the settings screen. I wonder why they picked 3 instead of 2 or 4. Like, some product manager actually sat down and thought about just how ridiculous they could be and have it still be acceptable.
My guess is the number was arbitrary and the limit exists because re-enabling triggers a mass scan of photos. Depending on whether they purge old data when it's turned off, toggling the switch could tell Microsoft's servers to re-scan every photo in your (possibly very large) library.
Odd choice and poor optics (just limit the number of times you can enable and add a warning screen) but I wouldn't assume this was intentionally evil bad faith.
3 is the smallest odd prime number. 3 is a HOLY number. It symbolizes divine perfection, completeness, and unity in many religions: the Holy Trinity in Christianity, the Trimurti in Hinduism, the Tao Te Ching in Taoism (and half a dozen others)
The number seems likely to be a deal that could be altered upward someday for those willing to rise above the minimal baseline tier.
Right now it doesn't say if these are supposed to be three different "seasons" of the year that you are able to opt-out, or three different "windows of opportunity".
Or maybe it means your allocation is limited to three non-surveillance requests per year. Which should be enough for average users. People aren't so big on privacy any more anyway.
Now would these be on a calendar year basis, or maybe one year after first implementation?
And what about rolling over from one year to another?
> Why is Microsoft so eager to also be able to know this?
A database of pretty much all Western citizens' faces? That's a massive sales opportunity for every oppressive and wannabe-oppressive government. Also, ads.
I agree with you, but there's nothing astonishing about any of this, unfortunately; it was bound to happen. Almost all cautionary statements about AI abuse fall on the deaf ears of HN's overenthusiastic and ill-informed rabble, stultified by YC tech lobbyists.
The worst part was that all the people fretting about ridiculous threats, like chatbots turning into Skynet, sucked the oxygen out of the room for the more realistic corporate threats.
Actually, most users probably don't understand that this ridiculous policy takes more effort to implement. They just blindly follow whatever MS prescribes and have long given up on making any sense of the digital world.
Can someone explain to me why the immediate perception is that this is some kind of bad, negative, evil thing? I don't understand it.
My assumption is that when this feature is on and you turn it off, they end up deleting the tags (since you've revoked permission for them to tag them). If it gets turned back on again, I assume that means they need to rescan them. So in effect, it sounded to me like a limit on how many times you can toggle this feature to prevent wasted processing.
Their disclaimer already suggests they don't train on your photos.
This is Microsoft. They have a proven record of turning these toggles back on automatically without your consent.
So you can opt out of them taking all of your most private moments and putting them into a data set that will be leaked, but you can only opt out 3 times. What are the odds a "bug" (feature) turns it on 4 times? Anything less than 100% is an underestimate.
And what does a disclaimer mean, legally speaking? They won't face any consequences when they use it for training purposes. They'll simply deny that they do it. When it's revealed that they did it, they'll say sorry, that wasn't intentional. When it's revealed to be intentional, they'll say it's good for you so be quiet.
If that was the case, the message should be about a limit on re-enabling the feature n times, not about turning it off.
Also, if they are concerned about processing costs, the default for this should be off, NOT on. The default for any feature like this that uses customers' personal data should be OFF, for any company that respects its customers' privacy.
> You are trying to reach really far out to find a plausible
This behavior tallies up with other things MS have been trying to do recently to gather as much personal data as possible from users to feed their AI efforts.
Their spokesperson also avoided answering why they are doing this.
On the other hand, your comment seems to be reaching really far to portray this as normal behavior.
Yeah exactly. Some people have 100k photo collections. The cost of scanning isn’t trivial.
They should limit the number of times you turn it on, not off. Some PM probably overthought it and insisted you need to tell people about the limit before turning it off and ended up with this awkward language.
If it were that simple, there would be no practical reason to limit that scrub to three (and in such a confusion-inducing way). If I want to waste my time scrubbing, that should be up to me -- assuming it is indeed just scrubbing tagged data, because if anything should have been learned by now, it is that:
the worst possible reading of any given feature must be assumed, to the detriment of the user and the benefit of the company
Honestly, these days, I do not expect much of Microsoft. In fact, I recently thought to myself, there is no way they can still disappoint. But what do they do? They find a way damn it.
> Their disclaimer already suggests they don't train on your photos.
We know all major GenAI companies trained extensively on illegally acquired material, and they were hiding this fact. Even the engineers felt this wasn't right, but there were no whistleblowers. I don't believe for a second it would be different with Microsoft. Maybe they'd introduce the plan internally as a kind of CSAM scanning, but, as opposed to Apple, they wouldn't inform users. The history of their attitude towards users is very consistent.
Then you would limit the number of times the feature can be turned on, not turned off. Turned off uses fewer resources, while turned on potentially continues using their resources. Also, I doubt they actually remove data that required processing to obtain; I wouldn't expect them to delete it until they're actually required to do so, especially considering the metadata obtained is likely insignificant in size compared to the average image.
It's an illusion of choice. For over a decade now, companies have either spammed you with modals/notifications until you give up and agree to privacy-compromising settings, or "accidentally" turned these on and pretended the change happened by mistake or bug.
The language used is deceptive and comes with "not now" or "later" options, never a permanent "no". Any disagreement is followed by some form of "we'll ask you again later" message.
Companies are deliberately removing user's control over software by dark patterns to achieve their own goals.
An advanced user may not want their data scanned, for whatever reason, and with this setting they cannot control the software, because the vendor decided it's just 3 times and after that the setting goes permanently "on".
And considering all the AI push within Windows and other Microsoft products, it is rather impossible to assume that MS will not be interested in training their algorithms on their customers'/users' data.
---
And I really don't know how else you can interpret this whole exchange with an unnamed "Microsoft publicist" when:
> Microsoft's publicist chose not to answer this question
and
> We have nothing more to share at this time
but as a hostile behavior. Of course they won't admit they want your data but they want it and will have it.
It sounds like you have revoked their permission to tag(verb) the photos, why should this interfere with what tag(noun) the photo already has?
But really, I know nothing about the process. I was going to make an analogy about how it would be the same as Adobe deleting all your drawings after you let your Photoshop subscription lapse. But I realized that this is exactly the computing future these sorts of companies want, and my analogy is far from the proof by absurdity I wanted it to be. Sigh, now I am depressed.
Honestly, I hated when they removed automatic photo tagging. It was handy as hell when uploading hundreds of pictures from a family event, which is about all I use it for.
You seem to be implying that users won't accept this. But users have accepted all the other bullshit Microsoft has pulled so far. It genuinely baffles me why anyone would choose to use their products yet many do and keep making excuses why alternatives are not viable.
If you don't trust Microsoft but need to use Onedrive, there are encrypted volume tools (e.g. Cryptomator) specifically designed for use with Onedrive.
"You can only turn off this setting 3 times a year."
I look forward to getting a check from Microsoft for violating my privacy.
I live in a state with better-than-average online privacy laws, and scanning my face without my permission is a violation. I expect the class action lawyers are salivating at Microsoft's hubris.
I got $400 out of Facebook because it tagged me in the background of someone else's photo. Your turn, MS.
They could have avoided the negative press by changing the requirement to be that you can’t re-enable the feature after switching it off 3 times per year.
It’s not hard to guess the problem: Steady state operation will only incur scanning costs for newly uploaded photos, but toggling the feature off and then on would trigger a rescan of every photo in the library. That’s a potentially very expensive operation.
If you’ve ever studied user behavior you’ve discovered situations where users toggle things on and off in attempts to fix some issue. Normally this doesn’t matter much, but when a toggle could potentially cost large amounts of compute you have to be more careful.
For the privacy sensitive user who only wants to opt out this shouldn’t matter. Turn the switch off, leave it off, and it’s not a problem. This is meant to address the users who try to turn it off and then back on every time they think it will fix something. It only takes one bad SEO spam advice article about “How to fix _____ problem with your photos” that suggests toggling the option to fix some problem to trigger a wave of people doing it for no reason.
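If the motivation really is rescan cost, the enforcement is conceptually just a per-user counter with a yearly reset. Here is a minimal sketch of that idea (all names are hypothetical; this assumes a calendar-year window, which — as others in the thread note — the UI text doesn't actually specify, and it reflects nothing about Microsoft's real implementation):

```python
from datetime import datetime, timezone

MAX_DISABLES_PER_YEAR = 3  # the limit quoted on the settings screen


class ToggleBudget:
    """Tracks how many times a user may disable a costly feature per calendar year."""

    def __init__(self):
        self.year = datetime.now(timezone.utc).year
        self.disables_used = 0

    def try_disable(self) -> bool:
        now_year = datetime.now(timezone.utc).year
        if now_year != self.year:  # budget resets each calendar year
            self.year = now_year
            self.disables_used = 0
        if self.disables_used >= MAX_DISABLES_PER_YEAR:
            return False  # "You can only turn off this setting 3 times a year."
        self.disables_used += 1
        return True  # caller would now purge tags and stop scanning


budget = ToggleBudget()
results = [budget.try_disable() for _ in range(4)]
print(results)  # first three attempts succeed, the fourth is refused
```

Whether the budget is per calendar year or a rolling 12-month window, and whether unused disables roll over, are exactly the questions raised upthread that the setting's wording leaves open.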
I cancelled Facebook in part due to a tug-of-war over privacy defaults. They kept getting updated with some corporate pablum about how opting in benefited the user. It was just easier to permanently opt out via account deletion rather than keep toggling the options. I have no doubt Microsoft will do the same. I'm wiping my Windows partition and loading Steam OS or some variant and dual booting into some TBD Linux distro for development.
When I truly need Windows, I have an ARM VM in Parallels. Right now it gets used once a year at tax time.
But tomorrow they’ll add a new feature, with a different toggle, that does the same thing but will be distinct enough. That toggle will default on, and you’ll find it in a year and a half after it’s been active.
Control over your data is an illusion. The US economy is built upon corporations mining your data. That’s why ML engineers got to buy houses in the 2010s, and it’s why ML/AI engineers get to buy houses in the 2020s.
> - "Scan photos I upload" yes/no. No batch processing needed, only affects photos from now on.
This would create a situation where some of the photos have tags and some don’t. Users would forget why the behavior is different across their library.
Their solution? Google it and start trying random suggestions. Toggle it all on and off. Delete everything and start over with rescanning. This gets back to the exact problem they’re trying to avoid.
> - "Scan all missing photos (1,226)" can only be done 3x per year
There is virtually no real world use case where someone would want to stop scanning new photos but also scan all photos but only when they remember to press this specific button. The number of users who would get confused and find themselves in unexpected states of half-scanned libraries would outweigh the number of intentional uses of this feature by 1000:1 or more.
Tell you what, Microsoft: turn it off, leave it off, remove it, fire the developers who made it, forget you ever had the idea. Bet that saved some processing power?
Most of us wouldn't mind if the limitation was that you can't opt IN more than 3 times/year, but of course Microsoft dark patterned it to limit the opt outs.
> It’s not hard to guess the problem: toggling the feature off and then on would trigger a rescan of every photo in the library.
That would be a wild way to implement this feature.
I mean it's Microsoft so I wouldn't be surprised if it was done in the dumbest way possible but god damn this would be such a dumb way to implement this feature.
This would be because of the legal requirement to purge (erase) all the previous scan data once a user opts out. So the only way to re-enable is to scan everything again — unless you have some clever way I’ve not thought of?
And not just advertising. If ICE asks Microsoft to identify accounts of people who have uploaded a photo of "Person X", do you think they're going to decline?
They'd probably do it happily even without a warrant.
I'd bet Microsoft is doing this more because of threats from USG than because of advertising revenue.
> They'd probably do it happily even without a warrant
I'm old enough to remember when companies were tripping over themselves after 9/11 trying to give the government anything they could to help them keep an eye on Americans. They eventually learned to monetize this, and now we have the surveillance economy.
> and follow Microsoft's compliance with General Data Protection Regulation
Not in a million years. See you in court. As often, just because a press statement says something, it's not necessarily true and maybe only used to defuse public perception.
Truly bizarre. I'm so glad I detached from Windows a few years back, and now when I have to use it or another MS product (eg an Xbox) it's such an unpleasant experience, like notification hell with access control checks to read the notifications.
The sad thing is that they've made it this way, as opposed to Windows being inherently deficient; it used to be a great blend of GUI convenience with ready access to advanced functionality for those who wanted it, whereas MacOS used to hide technical things from a user a bit too much and Linux desktop environments felt primitive. Nowadays MS seems to think of its users as if they were employees or livestock rather than customers.
They are like a shitty Midas: everything they touch turns into a pile of crap. Yet people still buy their products. They think the turd is tasty, because billions of flies can't be wrong...
Meanwhile, Apple applies a different set of toxic patterns: lack of interoperability with other OSes, apps that try to store data mainly in iCloud, no headphone jack on the iPhone, etc.
Insider here, in M365 though not OneDrive. It did change, but not because of Satya; because of rules and legislation and bad press. Privacy and security are taken very seriously (at least by people who care to follow internal rules), not because "we're nice", but because:
- EU governments keep auditing us, so we gotta stay on our toes, do things by the book, and be auditable
- it's bad press when we get caught doing garbage like that. And bad press is bad for business
In my org, doing anything with customer's data that isn't directly bringing them value is theoretically not possible. You can't deliver anything that isn't approved by privacy.
Don't forget that this is a very big company. It's composed of people who actually care and want to do the right thing, people who don't really care and just want to ship and would rather not be impeded by compliance processes, and people who are actually trying to bypass these processes because they'd sell your soul for a couple bucks if they could. For the little people like me, the official stance is that we should care about privacy very much.
Our respect for privacy was one of the main reasons I'm still there. There was a good period of time when the actual sentiment was "we're the good guys", especially compared to Google and Facebook. A solid part of that was that our revenue was driven by subscriptions rather than ads. I guess the appeal of taking customers' money and exploiting their data anyway is too big. That's the kind of shit that will get me to leave.
Do we even think that was real? I think social media has been astroturfed for a long time now. If enough people make those claims, it starts to feel true even without evidence to support it.
Did they ever open source anything that really made you think "wow"? The best I could see was them "embracing" Linux, but embrace, extend, extinguish was always a core part of their strategy.
This week I have received numerous reminders from Microsoft to renew my Skype credit.
Everything I see from that company is farcical. Massive security lapses, lazy AI features with huge privacy flaws, steamrolling OS updates that add no value whatsoever, and heavily relying on their old playbook of just buying anything that looks like it could disrupt them.
P.S. The skype acquisition was $8.5B in 2011 (That's $12.24B in today's money.)
I don't understand how this is "losing their mind". Toggling this setting is expensive on the backend: opting in means "go and rescan all the photos"; opting out means "delete all the scanned information for this user". As a user, just make up your mind and set the setting. They let you opt in, they let you opt out; they just don't want to let you trigger tons of work every minute.
With each passing day since I switched from Windows to Linux at home, with decreasing friction, I am increasingly happy that I took the time to learn Linux and stuck with it. This is not a come-to-Linux call, because I know it is easier said than done for most non-technical folks. But it is a testimony that if you do, the challenges will eventually be worth it. Because at this point, Microsoft is just openly insulting its captive users.
Do you think the PR person responding here feels, underneath it all, the inhumanity of their responses? The fact that they're merely wasting everyone's time with their prevaricating non-answers? Knowing what they need to say to keep their job, but hurting internally at the stupidity of it all.
Or do they end up so enmeshed with the corporate machine that they start to really believe it all makes sense?
I think - at least for the people who stick with a career in PR - that they enjoy playing the game of giving an answer that is sort of related to the question but doesn't actually give a single bit of useful information. That they enjoy seeing how far they can push it without the interviewer straight-up accusing them of not answering the question.
At least that's the only way I can imagine them keeping their sanity.
I think HN skews towards a somewhat naive but good-natured crowd. Every time ethics or morality comes up on here, there is no shortage of defenders who simply don't want to accept the fact: yes, there are bad people out there who are not only OK with the bad things they do; some even actively enjoy it and pursue more of it.
This is such a norm in society now; PR tactics take priority over any notion of accountability, and most journalists and publishers act as stenographers, because challenging or even characterizing the PR line is treated as an unjustified attack and met with inflated claims of bias.
Just as linking to original documents, court filings, etc. should be a norm in news reporting, it should also be a norm to characterize PR responses (helpful, dismissive, evasive, or whatever) and link to the PR text itself, rather than treating it as valid body copy.
People need to treat PR like they do AIs. "You utterly failed to answer the question, try again and actually answer the question I asked this time." I'd love to see corporate representatives actually pressed to answer. "Did you actually do X, yes or no, if you dodge the question I'll present you as dodging the question and let people assume the worst."
They take people for idiots. This can work a few times, but even someone who isn't the brightest will eventually put two and two together when they get screwed again and again and again.
The worst part of all this is that even respectable news organisations like the BBC publish so many articles that are just the company's PR response verbatim. Even worse when it goes like:
- Victim says: hi, this thing is messed up and people need to know about it
- Company says: "bla bla bla" legal-speak, we don't recognise an issue, "bla bla bla"
End of article. Instead of saying "this comment doesn't seem to reflect the situation", or otherwise pointing out what anybody with a brain can see: that the two statements are not equal in evidence or truth.
I think the EU is flawed in more ways than just one. But every time I see "<AI feature> will be available starting now outside the EU", I am really grateful.
Meta just lost a court case against Bits of Freedom in the Netherlands, because their Instagram setting to turn off the attention-grabbing feed would reset every month or so. The court ruled that this infringed on users' freedom.
Microsoft gets a lot less difficult to reason about when we start to think of it as a statistical mean of human nature rather than the mind of one arbitrary evil bastard. They have 228k employees. The CEO has virtually zero direct influence on the end work product of any team.
Any organization this large is going to have approximately the same level of dysfunction overall. But, there are almost always parts of these organizations where specific leaders have managed to carve out a fiefdom and provide some degree of actual value to the customer. In the case of Microsoft, examples of these would be things like .NET, C#, Visual Studio [Code], MSSQL, Xbox.
Windows, Azure & AI are where most of the rot exists at Microsoft. Office is a wash - I am not a huge fan of what has happened to my Outlook install over the years, but Teams has dramatically stabilized since the covid days. Throwing away the rest of the apple because of a few blemishes is a really wasteful strategy.
Growing up, Microsoft's dominance felt so strong. Three decades later, there's a really high chance my kids will never own or use a Windows machine (unless their job gives them one).
Microsoft hate was something else in the '90s and 2000s. Yet people stayed with it as if they had no choice while OS/2, AmigaOS, NextStep, BeOS and all those UNIXes died.
A lot of people didn't and still don't. Sometimes your job/business requires certain software that is only available on windows. I'm not giving up my job for an OS. for the past 15 years or so I could do everything on Mac and Linux, but that might not always be the case. I certainly wouldn't pass up a lucrative consulting position because it was windows only.
An employer requires their workers to use Windows; the target audience for Windows is management, their HR and attorneys, and then the greater security services. MSFT sells investigative services.
Back then people really didn't have much of a choice.
Nowadays most things happen in browsers anyways, WINE/Proton have come a long way, and alternatives to almost anything windows-only have reached a critical quality threshold.
Microsoft knows the vast majority of professionals are forced to use their products and services or else they can't put food on the table. That's why Microsoft can operate with near impunity.
Does this mean that when you disable, all labels are deleted, and when you turn it back on it has to re-scan all of your photos? Could this be a cost-saving measure?
Whenever I have to use Windows, I just create a new throwaway account on proton, connect it to the mother throwaway account connected to a yahoo email account created in the before times, install what I need, and then never access that account again.
Gitlab. Codeberg. Neocities. Nekoweb. Wasmer. Surge. Digital Ocean. Freehostia. Awardspace. 000webhost. Static.run. Kinsta. Cloudflare Pages. Render. Hostinger. Ionos. Bluehost. Firebase. Netlify. Orbiter. Heliohost. There's probably hundreds of services with a free tier these days (though many of them will have strict limitations on website size and traffic, and you may have to run the build step locally).
For private repos there is Forgejo, Gitea and Gitlab.
For open-source: Codeberg
Yes, it'll make projects harder to discover, because you can't assume that "everything is on github" anymore. But it is a small price to pay for dignity.
How is this not revenge porn or something? If I upload sensitive photos somewhere, it's a 5-year prison sentence! The CEO of Microsoft can do it a billion times!
Security issue for targeted people: what if an MS account gets compromised and a bad actor plants illegal material on the computer, which is then scanned by the cloud before it is caught?
I was quite happy for a couple years to just use windows and wsl. Fully switched to Linux at home and Linux VM's at work. The thirst and desperation to make AI work gives me the creeps more than usual.
Almost feels like we are getting to a class action or antitrust case when you connect the dots. Almost all PCs come with Windows. De facto, you need to create an M$ account to use Windows locally. They opt you into OneDrive by default. They sync your docs by default. They upload all your photos into AI by default.
By "class action" I presume you're referring to the US. If so, no, the courts of law are forbidden to you. You will instead go to a secret tribunal where the laws do not matter. The arbiter will only continue to be paid if they continue to rule for corporations.
> Slashdot: What's the reason OneDrive tells users this setting can only be turned off 3 times a year? (And are those any three times — or does that mean three specific days, like Christmas, New Year's Day, etc.)
> [Microsoft's publicist chose not to answer this question.]
I say this about advertising too. After recently using Win11 for the first time, to remove malware, I was left with a gross feeling. My friend, whose computer it was, is not highly PC-literate, but when I talked about the AI shit built into these platforms, you could see the disgust building.
This doesn't feel like a problem at all. I only need to turn the setting off once, right? My immediate question to seeing that verbiage was, "how many times does the setting turn itself on in a year?"
We created this oligopoly because they were convenient, free, and powerful, and now it's time for us to pay the price.
Or find services that may not be as easy to use, may cost something, and may not have all the features you want, but which won't make unreasonable demands for your data.
In light of the way the US government is carrying on, I'd rather not give Microsoft any of my images.
"It's not your data citizen, you should be happy we made this OS for you. You are not smart enough to do it your self, we know what is best."
I can never help hearing this in my head, and am just incredibly thankful that we have Linux and FOSS in general. That really gives me hope for humanity at this point.
I type this in FireFox, on NixOS, with all my pics open in another tab, in Immich. Thank you, thank you, thank you.
Both Mac and Linux desktop/laptop machines are better and less loaded with shit. If you don’t need or want a full featured PC you have Android and iOS which are also better. Android you have to be careful of but if you pick well it can be customizable and less loaded with shit.
Steam is available for both Linux and macOS. Are there just not as many game titles? I just saw Cyberpunk show up in the Apple Store for Mac so there seems to be an effort to port more games off Windows.
I have a Windows VM but use it less and less. Only need now is to test and build some software for Windows.
Also: I realized what I do kind of like about Apple and how best to describe their ecosystem. It’s the devil you know. They are fairly consistent in their policies and they are better on privacy than others. Some of their policies suck, but they suck in known consistent ways.
If I left Apple, Linux (probably on Framework) is the only alternative.
> Steam is available for both Linux and macOS. Are there just not as many game titles?
The vast majority of games work fine under Linux now; in fact, most releases these days work even on day one. The only games that don't really work are the ones which use invasive kernel-level anti-cheat systems.
Disabling offline accounts is one thing but scanning and labeling your files to profile users is a whole other can of worms. This trajectory leads to zero privacy for the user and I feel like switching to Linux/Mac will be the only option sadly.
It really seems as though Microsoft has total contempt for their retail/individual customers. They do a lot to inconvenience those users, and it often seems gratuitous and unnecessary. (As it does in this case.)
...I guess Microsoft believes that they're making up for it in AI and B2B/Cloud service sales? Or that customers are just so locked-in that there's genuinely no alternative? I don't believe that the latter is true, and it's hard to come back from a badly tarnished brand. Won't be long before the average consumer hates Microsoft as much as they hate HP (printers).
You should ask the EU what happened with the Digital Markets Act, and whether it's possible to install arbitrary software on an iPhone today. If the answer is "no, you can't", that should give you a hint of how effective the EU really is when it comes to these issues.
I don't really see the issue. If you don't want the face recognition feature, then you'll turn it off once, and that's that. Maybe if you're unsure, you might turn it off, and then back on, and then back off again. But what's the use case where you'd want to do this more than 3x per year?
Presumably, it's somewhat expensive to run face recognition on all of your photos. When you turn it off, they have to throw away the index (they'd better be doing this for privacy reasons), and then rebuild it from scratch when you turn the feature on again.
If this is the true reason, then they have made some poor decisions throughout that still deserve criticism. Firstly by restricting the number of times you can turn it _off_ rather than _on_, secondly by not explaining the reason in the linked pages, and thirdly by having their publicist completely refuse to say a word on the matter.
In fact, if you follow the linked page, you'll find a screenshot showing it was originally worded differently, "You can only change this setting 3 times a year" dating all the way back to 2023. So at some point someone made a conscious decision to change the wording to restrict the number of times you can turn it _off_
Maybe what they see is that most people who turn it off will leave it off, but some people turn it off and turn it back on as a part of a pattern/habit around temporarily putting files on OneDrive they don't want to scan.
For example, people who don't use their encrypted vault on OneDrive, so they upload photos that should otherwise be encrypted to their normal OneDrive which gets scanned and tagged. It could be a photo of their driver's license, social security card, or something illicit.
So these users toggle the tagging feature on and off during this time.
Maybe the idea is to push these people's use case to the vault where it probably belongs?
Well, sometimes Microsoft decides to change your settings back. This has happened to me very frequently after installing Windows updates. I remember finding myself turning the same settings off time and again.
The "fuck you, user!" behavior of software companies now means there's no more "No", only "Maybe later". Every time I update Google Photos, it shows me the screen that "Photos backups are not turned on! Turn on now?" (because they want to upsell their paid storage space option).
> If you don't want the face recognition feature, then you'll turn it off once.
The issue is that this is a feature that, in any sane world, should 100% be opt-in, not opt-out.
Microsoft privacy settings are a case of - “It was on display in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying 'Beware of the Leopard.”
There's inherently nothing wrong with face recognition, I love being able to search my own photos on my iPhone. If you could keep it private, you totally would too.
Even KDE's Digikam can run "somewhat expensive" algorithms on your photos without melting your PC and making you wait a year to recognize and label faces.
Even my 10(?) year old iPhone X can do facial recognition and memory extraction on device while charging.
My Sony A7-III can detect faces in real time and distinguish among 5 registered faces for focus prioritization the moment I half-press the shutter.
That thing will take mere minutes on Azure when batched and fed through GPUs.
If my hunch is right, the option will have a "disable AI use for x months" slider and will turn itself on without letting you know. So you can't opt out of it completely, ever.
> When you turn it off, they have to throw away the index (they'd better be doing this for privacy reasons), and then rebuild it from scratch when you turn the feature on again
This is probably the case. But Redmond being Redmond, they put their foot in their mouth by saying "you can only turn _off_ this setting 3 times a year" (emphasis mine).
But that's not necessarily true for everyone. And it doesn't need to be this way, either.
For starters, I think it'd help if we understood why they do this. I'm sure there's a cost to the compute MS spends on AI'ing all your photos; turning it off under privacy rules means throwing away the results of that compute, and turning it back on means MS pays again for work it has already done once. Limiting that makes sense.
What doesn't make sense is that I'd expect virtually nobody to turn it on and off over and over again, beyond 3 times, to the point that cost increases by more than a rounding error... like what type of user would do that, and why would that type of user not be exceedingly rare?
And even in that case, it'd make more sense to do it the other way around: you can turn on the feature 3 times per year, and off anytime. i.e. if you abuse it, you lose out on the feature, not your privacy.
So I think it is an issue that could and should be quickly solved.
The point is it’s sucking your data into some amorphous big brother dataset without explicitly asking you if you want that to happen first. Opt out AI features are generally rude, trashy, low-class, money grubbing data grabs
Right, while I understand the potential compute cost, it would be like the iPhone restricting the number of times you could use “allow once“ for location permissions.
Presumably, it's somewhat expensive to run face recognition on all of your photos.
Very likely true, but we shouldn't have to presume. If that's their motivation, they should state it clearly up front and make it opt-out by default. They can put a (?) callout on the UI for design decisions that have external constraints.
I wonder if it's possible to encrypt the index with a key that's copied to the user's device, and if the user wants to turn off this setting, delete the key on the server. When they want to turn it back on, the device uploads the key. Yes, the key might end up gone if there's a reinstall, etc.
If the user leaves it off for a year, then delete the encrypted index from the server...
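The "delete the server's key, keep the ciphertext" idea described above is sometimes called crypto-shredding: turning the feature off becomes instant (drop the key) and turning it back on requires no rescan (re-upload the key). A minimal sketch of that flow, with a deliberately toy XOR stream cipher standing in for real authenticated encryption, and all names invented for illustration:

```python
import os
import hashlib

def keystream_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR stream cipher for illustration only -- NOT secure.
    # A real system would use AES-GCM or similar authenticated encryption.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    # XOR is its own inverse, so the same function encrypts and decrypts.
    return bytes(x ^ y for x, y in zip(data, out[: len(data)]))

class IndexStore:
    """Server side: holds the encrypted face index, plus (optionally) the key."""

    def __init__(self, index: bytes, key: bytes):
        self.ciphertext = keystream_cipher(index, key)
        self.key = key  # server's copy; the device keeps its own copy

    def disable(self):
        self.key = None  # crypto-shredding: index is now unreadable server-side

    def enable(self, device_key: bytes):
        self.key = device_key  # device re-uploads its copy of the key

    def read_index(self) -> bytes:
        if self.key is None:
            raise PermissionError("feature disabled: no key on server")
        return keystream_cipher(self.ciphertext, self.key)

device_key = os.urandom(32)
store = IndexStore(b"alice:photo1,photo7;bob:photo3", device_key)
store.disable()           # cheap, instant "off" -- no data to scrub
store.enable(device_key)  # cheap "on" -- no rescan needed
assert store.read_index() == b"alice:photo1,photo7;bob:photo3"
```

As the comment notes, the weak point is key durability: if the device is wiped or reinstalled while the feature is off, the key (and with it the index) is gone for good.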
How hard is it to turn it on? Does it show a confirmation message?
My wife has a phone with a button on the side that opens the microphone to ask Google questions. I guess 90% of the audio they get is "How the /&%/&#"% do I close this )(&(/&(%)?????!?!??"
I bought a new Motorola phone and there are no less than three ways to open Google assistant (side button, hold home button, swipe from corner). Took me about 10 seconds before I triggered it unintentionally and quickly figured out how to disable all of them...
"When this feature is disabled, facial recognition will be disabled immediately and existing recognition data will be purged within 60 days". Then you don't need a creepy message. Okay, so that's 6 times a year, but whatever.
So the message is: if you can, don't use OneDrive.
If you can't (work, etc.) try to avoid uploading sensitive documents in onedrive.
I always wondered who uses OneDrive for cloud storage. Hell, I think even Google Drive is better.
Microsoft has really pivoted to AI for all things. I wonder how many customers they will get vs how many they will lose due to this very invasive way of doing things.
This is once again strongly suggesting that Microsoft is thoroughly doomed if the money they've dumped into AI doesn't pan out. It seems to me that if your company is tied to Microsoft's cloud platform, you should probably consider moving away as quickly as you can. Paying the VMware tax and moving everything in-house is probably a better move at this point.
> Microsoft only lets you opt out of AI photo scanning
Their _UI_ says they let you opt out. I wouldn't bet on that actually being the case. At the very least - a copy of your photos goes to the US government, and they do whatever they want with it.
I wonder if you could write a program to make pictures with face tattoos the norm for Microsoft's AI to train on. Like, if enough people did this, would Microsoft's facial recognition start generating lots of face tats?
Microsoft gets most of its money from big corporate customers. Some of those customers are obligated by law not to leak sensitive personal data to servers on US soil, because those customers have the misfortune of being in countries with strong privacy laws, functioning civil societies and sometimes even left-wing governments. I know for a fact that the product in question, "OneDrive", is sometimes mandated in those companies as a backup solution for the company's computers. All it takes is a whistle-blowing incident or a chat with a journalist for this to become a major blow-up for Microsoft, with companies forced by tribunals to back out of contracts with Microsoft.
Why would anyone use this crap at this point? Buy a (possibly used) mini PC or thin client, install Linux and Samba on it, and voila: your own private "cloud", completely free of corporate interference, spyware and recurring fees. This works best with a static IP for remote access via Wireguard, but it can be made to work on a residential connection.
With a little more effort you can deploy Nextcloud, Home Assistant and a few other great FOSS projects and completely free yourself from Big Tech. The hardest part will probably be email on a residential connection, but it can be done with the help of a relay service for outgoing mail.
Isn't it cute when there's absolutely no rationale behind a new rule, and it's simply an incursion made in order to break down a boundary?
Look, scanning with AI is available!
Wow, scanning with AI is now free for everyone!
What? Scanning with AI is now opt-out?
Why would opting-out be made time-limited?
WTF, what's so special about 3x a year? Is it because it's the magic number?
Ah, the setting's gone again, I guess I can relax. I guess the market wanted this great feature, or else they wouldn't have gradually forced it on us. Anyway, you're a weird techie for noticing it. What do you have to hide?
There is a big rationale behind it. If their AI investments don't pan out, Microsoft will cease to exist. They've been out of ideas since the late 90s. They know that the subscription gravy train has already peaked. There is no more growth unless they fabricate new problems for which they will then force you to pay for the solution to the problem they created for you. Oh, your children were kidnapped because Microsoft sold their recognition and location data to kidnappers? Well you should have paid for Microsoft's identity protection E7 plus add-on subscription that prevents them from selling the data you did not authorize them to collect to entities that they should know better than to deal with.
I don't even get why they would need "ideas" or "growth", tbh. They have the most popular desktop operating system, they have one of the most popular office suites; surely they make plenty of profit from those. If they just focused on making their existing products not shit, they would remain a profitable company indefinitely. But instead they're enshittifying everything because they want more More MORE
I'm kind of surprised that it is Microsoft leading the field in this. It seems like something that'd be much more at home on an Apple or Google smartphone. But I suppose smartphones don't have the hardware power or network resources to pull this off without noticeably degrading performance.
"You can only turn off this setting 3 times a year."
Astonishing. They clearly feel their users have no choice but to accept this onerous and ridiculous requirement. As if users wouldn't understand that they'd have to go way out of their way to write the code which enforces this outcome. All for a feature which provides me dubious benefit. I know who the people in my photographs are. Why is Microsoft so eager to also be able to know this?
Privacy legislation is clearly lacking. This type of action should bring the hammer down swiftly and soundly upon these gross and inappropriate corporate decision makers. Microsoft has needed that hammer blow for quite some time now. This should make that obvious. I guess I'll hold my breath while I see how Congress responds.
It's hilarious that they actually say that right on the settings screen. I wonder why they picked 3 instead of 2 or 4. Like, some product manager actually sat down and thought about just how ridiculous they could be and have it still be acceptable.
My guess is that the number is arbitrary and the limit exists because re-enabling triggers a mass scan of photos. Depending on whether they purge old data when it's turned off, toggling the switch could tell Microsoft's servers to re-scan every photo in your (possibly very large) library.
Odd choice and poor optics (just limit the number of times you can enable it, and add a warning screen), but I wouldn't assume this was intentional bad faith.
3 is the smallest odd prime number. 3 is a HOLY number. It symbolizes divine perfection, completeness, and unity in many religions: the Holy Trinity in Christianity, the Trimurti in Hinduism, the Tao Te Ching in Taoism (and half a dozen others)
The number seems likely to be a deal that could be altered upward someday for those willing to rise above the minimal baseline tier.
Right now it doesn't say if these are supposed to be three different "seasons" of the year that you are able to opt-out, or three different "windows of opportunity".
Or maybe it means your allocation is limited to three non-surveillance requests per year. Which should be enough for average users. People aren't so big on privacy any more anyway.
Now would these be on a calendar year basis, or maybe one year after first implementation?
And what about rolling over from one year to another?
Or is it use it or lose it?
Enquiring minds want to know ;)
Manager: "Three is the number thou shall permit, and the number of the permitting shall be -- three."
> Why is Microsoft so eager to also be able to know this?
A database of pretty much all Western citizens' faces? That's a massive sales opportunity for all oppressive and wanna-be oppressive governments. Also, ads.
At this point I think it's just called a government, sadly enough.
Combine face recognition on personal photos with age checks that include photos, and you can link stuff directly to Microsoft/Google accounts for ads.
I agree with you, but there's nothing astonishing about any of this, unfortunately; it was bound to happen. Almost all cautionary statements about AI abuse fall on the deaf ears of HN's overenthusiastic and ill-informed rabble, stultified by YC tech lobbyists.
The worst part was that all the people fretting about ridiculous threats, like the chatbot turning into Skynet, sucked the oxygen out of the room for the more realistic corporate threats.
Actually, most users probably don't understand that this ridiculous policy is more effort to implement. They just blindly follow whatever MS prescribes and have long given up on making any sense of the digital world.
most people probably won't know MS is doing this at all until their data is leaked
Can someone explain to me why the immediate perception is that this is some kind of bad, negative, evil thing? I don't understand it.
My assumption is that when this feature is on and you turn it off, they end up deleting the tags (since you've revoked permission for them to tag them). If it gets turned back on again, I assume that means they need to rescan them. So in effect, it sounded to me like a limit on how many times you can toggle this feature to prevent wasted processing.
Their disclaimer already suggests they don't train on your photos.
This is Microsoft. They have a proven record of turning these toggles back on automatically without your consent.
So you can opt out of them taking all of your most private moments and putting them into a data set that will be leaked, but you can only opt out 3 times. What are the odds a "bug" (feature) turns it on 4 times? Anything less than 100% is an underestimate.
And what does a disclaimer mean, legally speaking? They won't face any consequences when they use it for training purposes. They'll simply deny that they do it. When it's revealed that they did it, they'll say sorry, that wasn't intentional. When it's revealed to be intentional, they'll say it's good for you so be quiet.
> to prevent wasted processing.
If that was the case, the message should be about a limit on re-enabling the feature n times, not about turning it off.
Also, if they are concerned about processing costs, the default for this should be off, NOT on. The default for any feature like this that uses customers' personal data should be OFF for any company that respects its customers' privacy.
> You are trying to reach really far out to find a plausible
This behavior tallies up with other things MS have been trying to do recently to gather as much personal data as possible from users to feed their AI efforts.
Their spokesperson also avoided answering why they are doing this.
On the other hand, your comment seems to be reaching really far to portray this as normal behavior.
That they limit opt-outs instead of opt-ins, when the opt-in is the only plausibly costly step, speaks for itself.
Yeah exactly. Some people have 100k photo collections. The cost of scanning isn’t trivial.
They should limit the number of times you turn it on, not off. Some PM probably overthought it and insisted you need to tell people about the limit before turning it off and ended up with this awkward language.
If it was that simple, there would be no practical reason to limit that scrub to three (and in such a confusion-inducing way). If I want to waste my time scrubbing, that should be up to me -- assuming it is indeed just scrubbing tagged data, because if anything should have been learned by now, it is that:
worst possible reading of any given feature must be assumed to the detriment of the user and benefit of the company
Honestly, these days, I do not expect much of Microsoft. In fact, I recently thought to myself, there is no way they can still disappoint. But what do they do? They find a way damn it.
> Their disclaimer already suggests they don't train on your photos.
We know all major GenAI companies trained extensively on illegally acquired material, and they hid this fact. Even the engineers felt it wasn't right, but there were no whistleblowers. I don't believe for a second it would be different with Microsoft. Maybe they'd introduce the plan internally as a kind of CSAM scanning, but, as opposed to Apple, they wouldn't inform users. The history of their attitude towards users is very consistent.
Then you would limit the number of times the feature can be turned on, not turned off. Turned off uses less resources, while turned on potentially continues using their resources. Also I doubt if they actually remove data that requires processing to obtain, I wouldn't expect them to delete it until they're actually required to do so, especially considering the metadata obtained is likely insignificant in size compared to the average image.
It's an illusion of choice. For over a decade now, companies have either spammed you with modals/notifications until you give up and agree to privacy-compromising settings, or "accidentally" turned them on and pretended the change happened by mistake or bug.
Language used is deceptive and comes with "not now" or "later" options and never a permanent "no". Any disagreement is followed by a form of "we'll ask you again later" message.
Companies are deliberately removing user's control over software by dark patterns to achieve their own goals.
An advanced user may not want their data scanned, for whatever reason, and with this setting they cannot control the software, because the vendor decided it's just 3 times and afterwards the setting goes permanently "on".
And considering all the AI push within Windows and other Microsoft products, it is rather impossible to assume that MS will not be interested in training their algorithms on their customers'/users' data.
---
And I really don't know how else you can interpret this whole talk with an unnamed "Microsoft's publicist" when:
> Microsoft's publicist chose not to answer this question
and
> We have nothing more to share at this time
but as a hostile behavior. Of course they won't admit they want your data but they want it and will have it.
Your explanation would make sense if the limit was on turning the feature on. The limitation is on turning it off.
> So in effect, it sounded to me like a limit on how many times you can toggle this feature to prevent wasted processing.
That would be a limit on how many times you can enable the setting, not preventing you from turning it off.
> Can someone explain to me why the immediate perception is that this is some kind of bad, negative, evil thing? I don't understand it.
I bet you "have nothing to hide".
We work with computers. Every thing that gets in the way of working is wasting time and nerves.
It sounds like you have revoked their permission to tag(verb) the photos, why should this interfere with what tag(noun) the photo already has?
But really, I know nothing about the process. I was going to make an allegory about how it would be the same as Adobe deleting all your drawings after you let your Photoshop subscription lapse. But then I realized that this is exactly the computing future these sorts of companies want, and my allegory is far from the proof by absurdity I wanted it to be. Sigh, now I am depressed.
> Their disclaimer already suggests they don't train on your photos.
Did you read it all? They also suggest that they care about your privacy. /s
Facebook introducing photo tagging was when I exited Facebook.
This was pre-AI hype, perhaps 15 years ago. It seems Microsoft feels it is normalised now. More than ever, you are their product. It strikes me as great insecurity.
Honestly, I hated when they removed automatic photo tagging. It was handy as hell when uploading hundreds of pictures from a family event, which is about all I use it for.
> I know who the people in my photographs are. Why is Microsoft so eager to also be able to know this?
Presumably it can be used for filtering as well - find me all pictures of me with my dad, etc.
Sure but if it was for your benefit, not theirs, they wouldn't force it on you.
You seem to be implying that users won't accept this. But users have accepted all the other bullshit Microsoft has pulled so far. It genuinely baffles me why anyone would choose to use their products yet many do and keep making excuses why alternatives are not viable.
Tip:
If you don't trust Microsoft but need to use Onedrive, there are encrypted volume tools (e.g. Cryptomator) specifically designed for use with Onedrive.
It's rather annoying that high-entropy files (also known as encrypted files... unknown magic header files) in OneDrive trigger ransomware protection.
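"High-entropy" here is actually measurable: encrypted or compressed data looks close to uniformly random, which is exactly what naive ransomware heuristics flag. A rough Shannon-entropy check illustrates the idea (the thresholds are assumptions for illustration, not OneDrive's actual heuristic):

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte, from 0.0 (constant) to 8.0 (uniform random)."""
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

text = b"the quick brown fox jumps over the lazy dog " * 200
random_blob = os.urandom(8000)  # stands in for an encrypted volume's contents

# English text lands well below the ceiling; random/encrypted bytes sit near 8.
print(round(shannon_entropy(text), 2))
print(round(shannon_entropy(random_blob), 2))
```

This is why an encrypted Cryptomator vault, with no recognizable magic header and near-maximal byte entropy, can look indistinguishable from a ransomware-encrypted file to a simple detector.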
I assume this would be a ... call it feature for now, so a feature not available in the EU due to GDPR violations.
My initial thoughts were so they could scan for csam while pretending as if users have a choice to not have their privacy violated.
From my understanding, CSAM scanning is always considered a separate, always on and mandatory subsystem in any cloud storage system.
"You can only turn off this setting 3 times a year."
I look forward to getting a check from Microsoft for violating my privacy.
I live in a state with better-than-average online privacy laws, and scanning my face without my permission is a violation. I expect the class action lawyers are salivating at Microsoft's hubris.
I got $400 out of Facebook because it tagged me in the background of someone else's photo. Your turn, MS.
Draconian Hobson's choice foisted upon users by technofeudal overlords. You are the product.
They could have avoided the negative press by changing the requirement to be that you can’t re-enable the feature after switching it off 3 times per year.
It’s not hard to guess the problem: Steady state operation will only incur scanning costs for newly uploaded photos, but toggling the feature off and then on would trigger a rescan of every photo in the library. That’s a potentially very expensive operation.
If you’ve ever studied user behavior you’ve discovered situations where users toggle things on and off in attempts to fix some issue. Normally this doesn’t matter much, but when a toggle could potentially cost large amounts of compute you have to be more careful.
For the privacy sensitive user who only wants to opt out this shouldn’t matter. Turn the switch off, leave it off, and it’s not a problem. This is meant to address the users who try to turn it off and then back on every time they think it will fix something. It only takes one bad SEO spam advice article about “How to fix _____ problem with your photos” that suggests toggling the option to fix some problem to trigger a wave of people doing it for no reason.
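For scale, a back-of-envelope estimate suggests even a full rescan of a large library is cheap per user, which cuts against the cost explanation. Every number below is an assumption for illustration, not a Microsoft figure:

```python
# Back-of-envelope cost of one full library rescan. All inputs are assumed.
photos = 100_000              # a large library, as mentioned elsewhere in the thread
seconds_per_photo = 0.05      # assumed batched GPU inference time per photo
gpu_cost_per_hour = 3.00      # assumed cloud GPU price, USD

gpu_hours = photos * seconds_per_photo / 3600
cost = gpu_hours * gpu_cost_per_hour
print(f"{gpu_hours:.2f} GPU-hours, ~${cost:.2f} per full rescan")
```

Even if these assumptions are off by an order of magnitude, the per-user cost stays in single-digit dollars, so the limit looks more like protection against the toggle-to-fix-it crowd in aggregate than against any individual user.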
> Turn the switch off, leave it off, and it’s not a problem.
Assuming that it doesn't mysteriously (due to some error or update, no doubt) move back to the on position by itself.
I cancelled Facebook in part due to a tug-of-war over privacy defaults. They kept getting updated with some corporate pablum about how opting in benefited the user. It was just easier to permanently opt out via account deletion rather than keep toggling the options. I have no doubt Microsoft will do the same. I'm wiping my Windows partition and loading Steam OS or some variant and dual booting into some TBD Linux distro for development.
When I truly need Windows, I have an ARM VM in Parallels. Right now it gets used once a year at tax time.
Oh the one you toggle will be off.
But tomorrow they’ll add a new feature, with a different toggle, that does the same thing but will be distinct enough. That toggle will default on, and you’ll find it in a year and a half after it’s been active.
Control over your data is an illusion. The US economy is built upon corporations mining your data. That’s why ML engineers got to buy houses in the 2010s, and it’s why ML/AI engineers get to buy houses in the 2020s.
I agree this is a concern, but it frustrates me that tech companies won't give us reasonable options.
- "Scan photos I upload" yes/no. No batch processing needed, only affects photos from now on.
- "Delete all scans (15,101)" if you are privacy conscious
- "Scan all missing photos (1,226)" can only be done 3x per year
"But users are dummies who cannot understand anything!" Not with that attitude they can't.
> - "Scan photos I upload" yes/no. No batch processing needed, only affects photos from now on.
This would create a situation where some of the photos have tags and some don’t. Users would forget why the behavior is different across their library.
Their solution? Google it and start trying random suggestions. Toggle it all on and off. Delete everything and start over with rescanning. This gets back to the exact problem they’re trying to avoid.
> - "Scan all missing photos (1,226)" can only be done 3x per year
There is virtually no real world use case where someone would want to stop scanning new photos but also scan all photos but only when they remember to press this specific button. The number of users who would get confused and find themselves in unexpected states of half-scanned libraries would outweigh the number of intentional uses of this feature by 1000:1 or more.
Tell you what, Microsoft: turn it off, leave it off, remove it, fire the developers who made it, forget you ever had the idea. Bet that saved some processing power?
Most of us wouldn't mind if the limitation was that you can't opt IN more than 3 times/year, but of course Microsoft dark patterned it to limit the opt outs.
That would be a wild way to implement this feature.
I mean it's Microsoft so I wouldn't be surprised if it was done in the dumbest way possible but god damn this would be such a dumb way to implement this feature.
This would be because of the legal requirement to purge (erase) all the previous scan data once a user opts out. So the only way to re-enable is to scan everything again — unless you have some clever way I’ve not thought of?
Disabling the feature would purge the data. That’s the intent.
If disabling the feature kept the data, that would be a real problem.
I don’t know why you think it’s dumb that they purge the data when you turn a feature off. That’s what you want.
You can really tell that Microsoft has adopted advertising as a major line of business.
The privacy violations they are racking up are very reminiscent of prior behavior we've seen from Facebook and Google.
And not just advertising. If ICE asks Microsoft to identify accounts of people who have uploaded a photo of "Person X", do you think they're going to decline?
They'd probably do it happily even without a warrant.
I'd bet Microsoft is doing this more because of threats from USG than because of advertising revenue.
> They'd probably do it happily even without a warrant
I'm old enough to remember when companies were tripping over themselves after 9/11 trying to give the government anything they could to help them keep an eye on Americans. They eventually learned to monetize this, and now we have the surveillance economy.
ICE don't have to ask for anything, the USG gets a copy of all data Microsoft collects from you, anyway. Remember:
https://www.pcmag.com/news/the-10-most-disturbing-snowden-re...
I don't understand how america works, but surely Microsoft isn't paid for data requested by whatever agency?
> They'd probably do it happily even without a warrant.
...and build them a nice portal to submit their requests and get the results back in real time.
> and follow Microsoft's compliance with General Data Protection Regulation
Not in a million years. See you in court. As often, just because a press statement says something, it's not necessarily true and maybe only used to defuse public perception.
Truly bizarre. I'm so glad I detached from Windows a few years back, and now when I have to use it or another MS product (eg an Xbox) it's such an unpleasant experience, like notification hell with access control checks to read the notifications.
The sad thing is that they've made it this way, as opposed to Windows being inherently deficient; it used to be a great blend of GUI convenience with ready access to advanced functionality for those who wanted it, whereas MacOS used to hide technical things from a user a bit too much and Linux desktop environments felt primitive. Nowadays MS seems to think of its users as if they were employees or livestock rather than customers.
Microsoft in the past few years has totally lost its mind; it's ruining nearly everything it touches and I can't understand why.
They are like a shitty Midas: everything they touch turns into a pile of crap. Yet people still buy their products. They think the turd is tasty, because billions of flies can't be wrong...
Meanwhile Apple is applying different set of toxic patterns. Lack of interoperability with other OS, their apps try to store data mainly on iCloud, iPhone has no jack connector etc.
They never changed. For some reason Satya became CEO and nerds fawned over the “new Microsoft” for whatever reason.
They are a hard nosed company focused with precision on dominance for themselves.
Insider here, in m365 though not OneDrive. It did change, but not because of Satya; because of rules and legislation and bad press. Privacy and security are taken very seriously (at least by people who care to follow internal rules) not because "we're nice", but because:
- EU governments keep auditing us, so we gotta stay on our toes, do things by the book, and be auditable
- it's bad press when we get caught doing garbage like that. And bad press is bad for business
In my org, doing anything with customer's data that isn't directly bringing them value is theoretically not possible. You can't deliver anything that isn't approved by privacy.
Don't forget that this is a very big company. It's composed of people who actually care and want to do the right thing, people who don't really care and just want to ship and would rather not be impeded by compliance processes, and people who are actually trying to bypass these processes because they'd sell your soul for a couple bucks if they could. For the little people like me, the official stance is that we should care about privacy very much.
Our respect for privacy was one of the main reasons I'm still there. There was a good period of time when the actual sentiment was "we're the good guys", especially compared to Google and Facebook. A solid part of that was that our revenue was driven by subscriptions rather than ads. I guess the appeal of taking customers' money and exploiting their data too is too big. That's the kind of shit that will get me to leave.
Do we even think that was real? I think social media has been astroturfed for a long time now. If enough people make those claims, it starts to feel true even without evidence to support it.
Did they ever open source anything that really make you think "wow"? The best I could see was them "embracing" Linux, but embrace, extend, extinguish was always a core part of their strategy.
Money and power. Who was the first BigTech co on the Prism slides? Who muscled out competitors in the 90s?
Microsoft wants money. Microsoft does not care about you.
> Microsoft in the past few years has totally lost it's mind
I don't know what this Microsoft thing is that you speak of. I only know a company called Copilot Prime.
This week I have received numerous reminders from Microsoft to renew my Skype credit..
Everything I see from that company is farcical. Massive security lapses, lazy AI features with huge privacy flaws, steamrolling OS updates that add no value whatsoever, and heavily relying on their old playbook of just buying anything that looks like it could disrupt them.
P.S. The skype acquisition was $8.5B in 2011 (That's $12.24B in today's money.)
CEO v3.0 Satya is the reason. He can't innovate, he can only play 'chase the leader'
I don't understand how this is losing their mind. Toggling this setting is expensive on the backend: opting in means "go and rescan all the photos"; opting out means "delete all the scanned information for this user". As a user, just make up your mind and set the setting. They let you opt in, they let you opt out; they just don't want to let you trigger tons of work every minute.
If this was the case, they would leave it in the off state after you run out of toggles. The reality is that it will magically turn on every month.
I don't understand how you think repeating this nonsense excuse for an argument will achieve anything.
There was a time with a strong sentiment of Satya Nadella making MS great again.
Oh what time does to things!
With each passing day since I switched from Windows to Linux at home, with decreasing friction, I am increasingly happy that I took the time to learn Linux and stuck with it. This is not a come-to-Linux call, because I know it is easier said than done for most non-technical folks. But it is a testimony that if you do, the challenges will eventually be worth it. Because at this point, Microsoft is just openly insulting their captive users.
You know, in the 90s in Russian IT circles Windows was known as "маздай", a transliteration of "must die".
Looks like nothing has changed.
Do you think the PR person responding here feels, underneath it all, the inhumanity of their responses? The fact that they're merely wasting everyone's time with their prevaricated non-answers? Knowing what they need to say to keep their job but hurting internally at the stupidity of it all.
Or do they end up so enmeshed with the corporate machine that they start to really believe it all makes sense?
I think - at least for the people who stick with a career in PR - that they enjoy playing the game of giving an answer that is sort of related to the question but doesn't actually give a single bit of useful information. That they enjoy seeing how far they can push it without the interviewer straight up accusing them of not answering the question.
At least that's the only way I can imagine them keeping their sanity.
It's in their job description, they're most likely very proud of how their words can swindle the majority. They're greasy and they love it.
I think HN skews towards a somewhat naive but good natured crowd. Every time ethics or morality comes up on here there is no shortage of defenders that simply don't want to accept the fact. Yes there are bad people out there that are not only ok with the bad things they do but even some that actively enjoy it and pursue more of it.
3 replies →
Stop anthropomorphizing Microsoft PR speakers.
Did anyone notice that Microsoft never answered any of the questions asked, but deflected them?
They are exactly where I left them 20 years ago.
It's very sad that I can't stop using them again for doing this.
This is such a norm in society now: PR tactics take priority over any notion of accountability, and most journalists and publishers act as stenographers, because challenging or even characterizing the PR line is treated as an unjustified attack and draws inflated claims of bias.
Just as linking to original documents, court filings etc. should be a norm in news reporting, it should also be a norm to summarize PR responses (helpful, dismissive, evasive or whatever) and link to a summary of the PR text, rather than treating it as valid body copy.
People need to treat PR like they do AIs. "You utterly failed to answer the question, try again and actually answer the question I asked this time." I'd love to see corporate representatives actually pressed to answer. "Did you actually do X, yes or no, if you dodge the question I'll present you as dodging the question and let people assume the worst."
7 replies →
They take people for idiots. This can work a few times, but even someone who isn't the brightest will eventually put two and two together when they get screwed again and again and again.
It's not just PR tactics for the sake of avoiding accountability. It's because there's a glut of lawyers that'll sue over the tiniest admission of anything.
1 reply →
link to a summary of the PR text
Should have just said 'link to a screenshot of the PR text', apologies for the confusion
The worst part of all this is that even respectable news organisations like the BBC publish so many articles that are just the company's PR response verbatim. Even worse when it's like:
- Victim says: hi, this thing is messed up and people need to know about it
- Company says: "bla bla bla" legal speak, we don't recognise an issue, "bla bla bla"
End of article, instead of adding "this comment doesn't seem to reflect the situation", or otherwise pointing out that anybody with a brain can see the two statements are not equal in evidence or truth.
They prevaricated all of their answers, and that itself is far more telling.
I was afraid for the EU economy, but after this declaration I'm reassured that Microsoft will pay for my grand kids' education in 30 years.
I think the EU is flawed in more ways than just one. But every time I see „<AI feature> will be available starting now outside the EU" I am really grateful.
Meta just lost a court case against Bits of Freedom in the Netherlands, because their Instagram setting to turn off the attention-grabbing feed would reset every month or so. The court ruled that this infringed on users' freedom.
Source: https://www.dutchnews.nl/2025/10/court-tells-meta-to-give-du...
Microsoft gets a lot less difficult to reason about when we start to think of it as a statistical mean of human nature rather than the mind of one arbitrary evil bastard. They have 228k employees. The CEO has virtually zero direct influence on the end work product of any team.
Any organization this large is going to have approximately the same level of dysfunction overall. But, there are almost always parts of these organizations where specific leaders have managed to carve out a fiefdom and provide some degree of actual value to the customer. In the case of Microsoft, examples of these would be things like .NET, C#, Visual Studio [Code], MSSQL, Xbox.
Windows, Azure & AI are where most of the rot exists at Microsoft. Office is a wash - I am not a huge fan of what has happened to my Outlook install over the years, but Teams has dramatically stabilized since the covid days. Throwing away the rest of the apple because of a few blemishes is a really wasteful strategy.
Growing up, Microsoft dominance felt so strong. 3 decades later, there’s a really high chance my kids will never own or use a windows machine (unless their jobs gives them one).
Do you remember this: http://toastytech.com/evil/index.html ?
Microsoft hate was something else in the '90s and 2000s. Yet people stayed with it as if they had no choice while OS/2, AmigaOS, NextStep, BeOS and all those UNIXes died.
A lot of people didn't and still don't. Sometimes your job/business requires certain software that is only available on windows. I'm not giving up my job for an OS. for the past 15 years or so I could do everything on Mac and Linux, but that might not always be the case. I certainly wouldn't pass up a lucrative consulting position because it was windows only.
An employer requires their workers to use Windows; the target audience for Windows is management, their HR and attorneys, and then the greater security services. MSFT sells investigative services.
Back then people really didn't have much of a choice.
Nowadays most things happen in browsers anyways, WINE/Proton have come a long way, and alternatives to almost anything windows-only have reached a critical quality threshold.
>unless their jobs gives them one
Microsoft knows the vast majority of professionals are forced to use their products and services or else they can't put food on the table. That's why Microsoft can operate with near impunity.
Does this mean that when you disable, all labels are deleted, and when you turn it back on it has to re-scan all of your photos? Could this be a cost-saving measure?
In that case, they should make it the other way around — you can enable this only three times a year.
They should do it the other direction, then: if you turn it off more than three times you can’t turn it back on.
But that's less good for profit. Why would they give up money for morals?
1 reply →
No, it's a profit-seeking measure.
>Does this mean that when you disable, all labels are deleted
AHHAHAHAHAHAHAHAHA.
Ha.
Nice one.
It’s like expecting a lion to stop eating you if you ask it politely.
There's a great solution to this.
Just stop using Microsoft shit. It's a lot easier than untangling yourself from Google.
Yeah it is legitimately hard to avoid Google, if nothing else some of your emails will probably be leaked to Gmail.
But Microsoft is pretty easy to avoid after their decade of floundering.
Whenever I have to use Windows, I just create a new throwaway account on proton, connect it to the mother throwaway account connected to a yahoo email account created in the before times, install what I need, and then never access that account again.
1 reply →
How can I play starcraft 2 without it?
3 replies →
Yes. Just use Immich for photos. AI scanning, but local and only opt-in.
Is there a free platform that will let me blog like GitHub Pages works?
Gitlab. Codeberg. Neocities. Nekoweb. Wasmer. Surge. Digital Ocean. Freehostia. Awardspace. 000webhost. Static.run. Kinsta. Cloudflare Pages. Render. Hostinger. Ionos. Bluehost. Firebase. Netlify. Orbiter. Heliohost. There's probably hundreds of services with a free tier these days (though many of them will have strict limitations on website size and traffic, and you may have to run the build step locally).
https://pico.sh/
you mean like stop using GitHub?
Yes.
For private repos there is Forgejo, Gitea and Gitlab.
For open-source: Codeberg
Yes, it'll make projects harder to discover, because you can't assume that "everything is on github" anymore. But it is a small price to pay for dignity.
2 replies →
Yes, that too.
How is this not revenge porn or something? If I upload sensitive photos somewhere, it's a 5-year prison sentence! The CEO of Microsoft can do that a billion times!
Security issue for targeted people. What if an MS account gets compromised and a bad actor plants illegal material on the computer that is then scanned by the cloud before it is caught.
I was quite happy for a couple years to just use windows and wsl. Fully switched to Linux at home and Linux VM's at work. The thirst and desperation to make AI work gives me the creeps more than usual.
Microsoft: forces OneDrive on users via dark pattern dialogs that many users just accept
Users: save files "on their PC" (they think)
Microsoft: Rolls out AI photo-scanning feature to unknowing users, intending to learn something about them.
Users: WTF? And there are rules on turning it on and off?
Microsoft: We have nothing more to share at this time.
Favorite quote from the article:
> [Microsoft's publicist chose not to answer this question.]
Almost feels like we are getting to class action or antitrust when you connect the dots. Almost all PCs come with Windows. De facto, you need to create a M$ account to use Windows locally. They opt you into OneDrive by default. They sync your docs by default. They upload all your photos into AI by default.
By "class action" I presume you're referring to the US. If so, no, the courts of law are forbidden to you. You will instead go to a secret tribunal where the laws do not matter. The arbiter will only continue to be paid if they continue to rule for corporations.
https://www.microsoft.com/en-us/servicesagreement#15_binding...
1 reply →
You can use Windows without a Microsoft account, but the dark patterns make this very difficult to navigate.
1 reply →
Tell them "you may only refuse to answer this question 3 times a year".
It's totally worth self hosting files, it's gotten much better.
What happens to the faces in photos that do not belong to the photo owner?
Do they get scanned as well without the person's permission?
> Slashdot: What's the reason OneDrive tells users this setting can only be turned off 3 times a year? (And are those any three times — or does that mean three specific days, like Christmas, New Year's Day, etc.)
> [Microsoft's publicist chose not to answer this question.]
Microsoft's understanding of consent is about on-par with that of a rapist.
I say this about advertising and after recently using Win11 for the first time to remove malware, I was left with a gross feeling. My friend whose computer it was is not highly PC literate, but when I was talking about the AI shit built in to these platforms, you could see the disgust building.
This doesn't feel like a problem at all. I only need to turn the setting off once, right? My immediate question to seeing that verbiage was, "how many times does the setting turn itself on in a year?"
This made me look up if you can disable iOS photo scanning and you can’t. Hmm.
We created this oligopoly because they were convenient, free, powerful, and now it's time for us to pay the price.
Or find services that may not be as easy to use, may cost something and may not have all the features you want, but which won't make unreasonable demands for your data.
In light of the way the US government is carrying on, I'd rather not give Microsoft any of my images.
> In light of the way the US government is carrying on, I'd rather not give Microsoft any of my images.
What is this supposed to mean? That you'd be happier with the dystopia if they were going after people you like less?
"It's not your data citizen, you should be happy we made this OS for you. You are not smart enough to do it your self, we know what is best."
I can't help hearing this voice in my head, and I'm just incredibly thankful that we have Linux and FOSS in general. That really gives me hope for humanity at this point.
I type this in FireFox, on NixOS, with all my pics open in another tab, in Immich. Thank you, thank you, thank you.
Mozilla doesn't think it's your data either.
Why does anyone still run Windows?
Games I guess.
Both Mac and Linux desktop/laptop machines are better and less loaded with shit. If you don’t need or want a full featured PC you have Android and iOS which are also better. Android you have to be careful of but if you pick well it can be customizable and less loaded with shit.
Steam is available for both Linux and macOS. Are there just not as many game titles? I just saw Cyberpunk show up in the Apple Store for Mac so there seems to be an effort to port more games off Windows.
I have a Windows VM but use it less and less. Only need now is to test and build some software for Windows.
Also: I realized what I do kind of like about Apple and how best to describe their ecosystem. It’s the devil you know. They are fairly consistent in their policies and they are better on privacy than others. Some of their policies suck, but they suck in known consistent ways.
If I left Apple, Linux (probably on Framework) is the only alternative.
> Why does anyone still run Windows?
Learned helplessness.
> Steam is available for both Linux and macOS. Are there just not as many game titles?
A vast majority of games work fine under Linux now; in fact, most releases these days work even on day one. The only games that don't really work are ones which use invasive kernel-level anti-cheat systems.
> Why does anyone still run Windows?
> Games I guess.
Music, and video.
Disabling offline accounts is one thing but scanning and labeling your files to profile users is a whole other can of worms. This trajectory leads to zero privacy for the user and I feel like switching to Linux/Mac will be the only option sadly.
Obviously the whole point is to make AI overreach avoidance as painful as possible.
Of course, that's also the reason why Lens was deprecated despite being a good, useful app, forcing one to deal with the bloat of Copilot 365.
It really seems as though Microsoft has total contempt for their retail/individual customers. They do a lot to inconvenience those users, and it often seems gratuitous and unnecessary. (As it does in this case.)
...I guess Microsoft believes that they're making up for it in AI and B2B/Cloud service sales? Or that customers are just so locked-in that there's genuinely no alternative? I don't believe that the latter is true, and it's hard to come back from a badly tarnished brand. Won't be long before the average consumer hates Microsoft as much as they hate HP (printers).
That’s not opt out. Opt out is the ability to say no. If you’re not allowed to say no there’s no consent and you’re being forced.
If you opt out and then never turn it back on, you have opted out.
Because Microsoft is known to respect user settings between (forced) Windows Updates and not turn stuff back on...
Year of the Linux desktop edges ever closer.
Seems obvious they actually mean to limit the number of times you can opt in. Very poor choice of words.
The difference is whether you get locked into having it on or having it off at the end.
> You can only turn off this setting 3 times a year.
Who's making the t-shirts? Don't forget the Microsoft logo. They're proud of this!
In my head it's sounding like that Christmas jingle. It's the most wonderful time of the year!
EU please whack them and whack them good
You should ask the EU what happened with the Digital Markets Act, and whether it's possible to install arbitrary software on an iPhone today. If the answer is "no, you can't", that should give you a hint of how effective the EU really is when it comes to these issues.
I don't really see the issue. If you don't want the face recognition feature, then you'll turn it off once, and that's that. Maybe if you're unsure, you might turn it off, and then back on, and then back off again. But what's the use case where you'd want to do this more than 3x per year?
Presumably, it's somewhat expensive to run face recognition on all of your photos. When you turn it off, they have to throw away the index (they'd better be doing this for privacy reasons), and then rebuild it from scratch when you turn the feature on again.
If this is the true reason, then they have made some poor decisions throughout that still deserve criticism. Firstly by restricting the number of times you can turn it _off_ rather than _on_, secondly by not explaining the reason in the linked pages, and thirdly by having their publicist completely refuse to say a word on the matter.
In fact, if you follow the linked page, you'll find a screenshot showing it was originally worded differently, "You can only change this setting 3 times a year" dating all the way back to 2023. So at some point someone made a conscious decision to change the wording to restrict the number of times you can turn it _off_
Maybe what they see is that most people who turn it off will leave it off, but some people turn it off and turn it back on as a part of a pattern/habit around temporarily putting files on OneDrive they don't want to scan.
For example, people who don't use their encrypted vault on OneDrive, so they upload photos that should otherwise be encrypted to their normal OneDrive which gets scanned and tagged. It could be a photo of their driver's license, social security card, or something illicit.
So these users toggle the tagging feature on and off during this time.
Maybe the idea is to push these people's use case to the vault where it probably belongs?
Well, sometimes Microsoft decides to change your settings back. This has happened to me very frequently after installing Windows updates. I remember finding myself turning the same settings off time and again.
The "fuck you, user!" behavior of software companies now means there's no more "No", only "Maybe later". Every time I update Google Photos, it shows me the screen that "Photos backups are not turned on! Turn on now?" (because they want to upsell their paid storage space option).
6 replies →
> If you don't want the face recognition feature, then you'll turn it off once.
The issue is that is a feature that 100% should in any sane world be opt in - not opt out.
Microsoft privacy settings are a case of - “It was on display in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying 'Beware of the Leopard.”
There's inherently nothing wrong with face recognition, I love being able to search my own photos on my iPhone. If you could keep it private, you totally would too.
Even KDE's Digikam can run "somewhat expensive" algorithms on your photos without melting your PC and making you wait a year to recognize and label faces.
Even my 10(?) year old iPhone X can do facial recognition and memory extraction on device while charging.
My Sony A7-III can detect faces in real time, and discriminate it from 5 registered faces to do focus prioritization the moment I half-press the shutter.
That thing will take mere minutes on Azure when batched and fed through GPUs.
If my hunch is right, the option will have a "disable AI use for x months" slider and will turn itself on without letting you know. So you can't opt out of it completely, ever.
> When you turn it off, they have to throw away the index (they'd better be doing this for privacy reasons), and then rebuild it from scratch when you turn the feature on again
This is probably the case. But Redmond being Redmond, they put their foot in their mouth by saying "you can only turn *off* this setting 3 times a year" (emphasis mine).
Agreed, in practice for me there's no real issue.
But that's not necessarily true for everyone. And it doesn't need to be this way, either.
For starters I think it'd help if we understood why they do this. I'm sure there's a cost to the compute MS spends on AI'ing all your photos, turning it off under privacy rules means you need to throw away that compute. And turning it back on creates an additional cost for MS, that they've already spent for nothing. Limiting that makes sense.
What doesn't make sense is that I'd expect virtually nobody to turn it on and off over and over again, beyond 3 times, to the point that cost increases by more than a rounding error... like what type of user would do that, and why would that type of user not be exceedingly rare?
And even in that case, it'd make more sense to do it the other way around: you can turn on the feature 3 times per year, and off anytime. i.e. if you abuse it, you lose out on the feature, not your privacy.
So I think it is an issue that could and should be quickly solved.
> what's the use case where you'd want to do this more than 3x per year?
That means that all Microsoft has to do to get your consent to scan photos is turn the setting on every quarter.
So why not limit how many times you can turn it on, instead of off?
We all know why.
The point is it’s sucking your data into some amorphous big brother dataset without explicitly asking you if you want that to happen first. Opt out AI features are generally rude, trashy, low-class, money grubbing data grabs
To prevent you from having the option to temporarily disable it, so you have to choose between privacy and the supposed utility
Right, while I understand the potential compute cost, it would be like the iPhone restricting the number of times you could use “allow once“ for location permissions.
> Presumably, it's somewhat expensive to run face recognition on all of your photos.
Very likely true, but we shouldn't have to presume. If that's their motivation, they should state it clearly up front and make the feature off by default. They can put a (?) callout on the UI for design decisions that have external constraints.
Assuming this reasoning is accurate, why not just silently throw a rate limit error and simply not reenable it if it's repeatedly switched on and off?
I wonder if it's possible to encrypt the index with a key that's copied to the user's device, and if the user wants to turn off this setting, delete the key on the server. When they want to turn it back on, the device uploads the key. Yes, the key might end up gone if there's a reinstall, etc.
If the user leaves it off for a year, then delete the encrypted index from the server...
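The scheme in the two comments above is essentially crypto-shredding: the server keeps only an encrypted index plus (while the feature is on) a copy of the key, and "turning it off" just deletes the server-side key. A toy sketch, assuming a plain dict stands in for server storage and XOR with a hash-derived keystream stands in for real encryption (illustration only, not actual cryptography):

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    # Deterministic keystream derived from the key via SHA-256 counter mode.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor(data: bytes, key: bytes) -> bytes:
    # Symmetric: applying it twice with the same key recovers the data.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# Device generates the key; server stores a copy plus the encrypted index.
device_key = secrets.token_bytes(32)
index = b'{"photo1.jpg": ["alice"], "photo2.jpg": ["bob"]}'
server = {"key": device_key, "index": xor(index, device_key)}

# User turns the feature off: server deletes only its key copy.
# The index is now unreadable server-side, but the compute isn't wasted.
server["key"] = None

# User turns it back on: device re-uploads the key; no rescan needed.
server["key"] = device_key
assert xor(server["index"], server["key"]) == index
```

The year-of-inactivity cleanup then just deletes `server["index"]` too, at which point a re-enable really does require a full rescan.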
How hard it to turn it on? Does it show a confirmation message?
My wife has a phone with a button on the side that opens the microphone to ask questions to Google. I guess 90% of the audios they get are "How the /&%/&#"% do I close this )(&(/&(%)?????!?!??"
I bought a new Motorola phone and there are no less than three ways to open Google assistant (side button, hold home button, swipe from corner). Took me about 10 seconds before I triggered it unintentionally and quickly figured out how to disable all of them...
"When this feature is disabled, facial recognition will be disabled immediately and existing recognition data will be purged within 60 days". Then you don't need a creepy message. Okay, so that's 6 times a year, but whatever.
So the message is: if you can, don't use OneDrive.
If you can't (work, etc.), try to avoid uploading sensitive documents to OneDrive.
I always wondered who uses OneDrive for cloud storage. Hell, I think even Google Drive is better.
Microsoft has really pivoted to AI for all things. I wonder how many customers they will get vs how many they will lose due to this very invasive way of doing things.
This is once again strongly suggesting that Microsoft is thoroughly doomed if the money they've dumped into AI doesn't pan out. It seems to me that if your company is tied to Microsoft's cloud platform, you should probably consider moving away as quickly as you can. Paying the VMware tax and moving everything in house is probably a better move at this point.
> I uploaded a photo on my phone to Microsoft's
That's your problem right there.
> Microsoft only lets you opt out of AI photo scanning
Their _UI_ says they let you opt out. I wouldn't bet on that actually being the case. At the very least - a copy of your photos goes to the US government, and they do whatever they want with it.
I wonder if you could write a program to make pictures with face tattoos the norm for Microsoft AI to train on. Like, if enough people did this, would Microsoft's facial recognition start generating lots of face tats?
Fedora with vanilla Gnome is excellent for anyone looking for an alternative.
503 service unavailable - did Slashdot get HNed? Ironic. (Probably not - it's probably unrelated)
This sounds like the next level of the nauseating “maybe later”.
i.e. You’ll do what we tell you eventually.
Makes me want to download and install windows, and store a picture of my hairy brown nutsack with googly eyes on it.
Cue Nair and football season ads
Microsoft is such a scummy company. They always were but they've become even worse since they've gone all in on AI.
I wonder if this is also a thing for their EU users. I can think of a few laws this violates.
It's enough to make a man consider Linux...
A man perhaps, but not the average frog. The frog will continue to insist that it will freeze if it dares to step out of the familiar warm pot.
I think a call to Australia’s privacy commissioner might be in order.
What are they gonna do? Hard to have a convo with your master when you're on your knees...
Fine the living daylights out of them. They are quite willing and able to do so.
Well put.
1 reply →
Microsoft gets most of its money from big corporate customers. Some of those customers are obligated by law not to leak sensitive personal data to servers on US soil, because those customers have the misfortune of being in countries with strong privacy laws, functioning civil societies and sometimes even left-wing governments. I know for a fact that the product in question, "OneDrive", is sometimes mandated in those companies as a backup solution for the company's computers. All it takes is a whistle-blowing incident or a chat with a journalist for this to become a major blow-up for Microsoft, with companies forced by tribunals to back out of contracts with Microsoft.
To think I was paying for this yearly
Unbelievable
Presumably you just need to turn it off once, right?
Crossposting slashdot?
Heaven forfend!
They are the ones who did this interview
Why would anyone use this crap at this point? Buy a (possibly used) mini PC or thin client, install Linux and Samba on it, and voila: your own private "cloud" completely free of corporate interference, spyware and recurring fees. This works best with a static IP for remote access via WireGuard, but it can be made to work on a residential connection.
With a little more effort you can deploy Nextcloud, Home Assistant and a few other great FOSS projects and completely free yourself from Big Tech. The hardest part will probably be email on a residential connection, but it can be done with the help of a relay service for outgoing mail.
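For the Samba piece of the setup described above, a minimal sketch. Debian/Ubuntu package names are assumed, and the share name, path and user `alice` are placeholders:

```shell
# Assumed Debian/Ubuntu host; share path, name and user are placeholders.
sudo apt install samba wireguard

# Expose one directory as a password-protected SMB share:
sudo tee -a /etc/samba/smb.conf <<'EOF'
[cloud]
   path = /srv/cloud
   valid users = alice
   read only = no
EOF
sudo smbpasswd -a alice          # set a Samba password for alice
sudo systemctl restart smbd

# A WireGuard config in /etc/wireguard/wg0.conf plus `wg-quick up wg0`
# then gives remote access without exposing SMB to the internet.
```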
Isn't it cute when there's absolutely no rationale behind a new rule, and it's simply an incursion made in order to break down a boundary?
Look, scanning with AI is available!
Wow, scanning with AI is now free for everyone!
What? Scanning with AI is now opt-out?
Why would opting-out be made time-limited?
WTF, what's so special about 3x a year? Is it because it's the magic number?
Ah, the setting's gone again, I guess I can relax. I guess the market wanted this great feature, or else they wouldn't have gradually forced it on us. Anyway, you're a weird techie for noticing it. What do you have to hide?
There is a big rationale behind it. If their AI investments don't pan out, Microsoft will cease to exist. They've been out of ideas since the late 90s. They know that the subscription gravy train has already peaked. There is no more growth unless they fabricate new problems for which they will then force you to pay for the solution to the problem they created for you. Oh, your children were kidnapped because Microsoft sold their recognition and location data to kidnappers? Well you should have paid for Microsoft's identity protection E7 plus add-on subscription that prevents them from selling the data you did not authorize them to collect to entities that they should know better than to deal with.
I don't even get why they need "ideas" or "growth", tbh. They have the most popular desktop operating system and one of the most popular office suites; surely they make plenty of profit from those. If they just focused on making their existing products not shit, they would remain profitable indefinitely. But instead they're enshittifying everything because they want more More MORE.
2 replies →
Reminder: Microsoft owns Github and NPM.
I'm kind of surprised that it is Microsoft leading the field in this. It seems like something that'd be much more at home on an Apple or Google smartphone. But I suppose smartphones don't have the hardware or network power or resources to pull this off without noticeably degrading performance.
Slashdot: why opt-out rather than opt-in?
Microsoft: it's just as shit as Microsoft 365 and SharePoint.
I've never seen a better case for uploading endless AI slop photos.
This is your daily reminder not to use Microsoft.
fuck microsoft
Micro Soft Windows
With a name like that who needs gravity.
[dead]