Comment by forgotoldacc

3 days ago

This is Microsoft. They have a proven record of turning these toggles back on automatically without your consent.

So you can opt out of them taking all of your most private moments and putting them into a data set that will be leaked, but you can only opt out 3 times. What are the odds a "bug" (feature) turns it on 4 times? Anything less than 100% is an underestimate.

And what does a disclaimer mean, legally speaking? They won't face any consequences when they use it for training purposes. They'll simply deny that they do it. When it's revealed that they did it, they'll say sorry, that wasn't intentional. When it's revealed to be intentional, they'll say it's good for you so be quiet.

A bug, or a dialog box that says "Windows has reviewed your photo settings and found possible issues. Press Accept now to reset settings to secure defaults."

This is how my parents get Binged a few times per year.

  • This feels different, though. Every time you turn it off and then on again, there is a substantial processing cost for MS. If MS "accidentally" turns it on and then doesn't allow you to turn it off, it raises the bar for them to successfully defend these actions in court.

    So to me it looks like MS is trying to stop users from hammering its infrastructure with repeated, expensive full scans of their libraries. I would have worded it differently and said "you can only turn ON this setting 4 times a year". But maybe they do want to leave the door open to "accidentally" pushing the wrong setting to users.

    • As stated many times elsewhere here, if that were the case, it would be an opt-in limit. Instead it's an opt-out limit, from a company with a proven record of forcing users into agreements against their will and requiring an opt-out (that often doesn't work) after the fact.

      Nobody really believes the fiction that the processing is so heavy that opt-outs have to be limited.

There’s dark pattern psychology at play here. You are very likely to forget to do something that you can only do three times a year.

The good news is that the power of this effect is lost when significant attention is placed on it, as it is in this case.

Of course, the problem with having your data available even for a day or so, let's say because you didn't read your e-mails that day, is that your data will be trained on and used for M$'s purposes. They will have powerful server farms at the ready, holding your data at gunpoint, so that the moment they manage to fabricate fake consent they can process it, before you can even finish reading any late notification e-mail, if one arrives at all.

Someone show me a single case where big tech has successfully removed such data from an already-trained model, or, where that is impossible with the black boxes they create, removed the whole black box because a few people complained about their data being inside it. No one can, because it has not happened. Just as ML models are used as laundering devices, they are also used as responsibility shields for big tech, who rake in the big money.

This is M$'s real intention here. Let's not fool ourselves.