
Comment by talldayo

1 month ago

The right call is to provide the feature and let users opt in. Apple knows this is bad; they've directly witnessed the backlash to OCSP, lawful intercept, and client-side scanning. There is no world in which they failed to see the problem: they enabled it by default anyway, knowing full well that users aren't comfortable with this.

People won't trust homomorphic encryption, entropy seeding or relaying when none of it is transparent and all of it is enabled in an OTA update.
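For context, since the term is doing the heavy lifting here: homomorphic encryption lets a server compute on data it cannot decrypt. Below is a toy sketch of the additively homomorphic Paillier scheme in Python. It is purely illustrative: the primes are insecurely tiny, and Apple's feature reportedly uses a lattice-based BFV scheme rather than Paillier.

```python
import math
import random

def keygen(p: int = 61, q: int = 53):
    """Tiny, insecure demo key (real deployments use ~2048-bit primes)."""
    n = p * q
    g = n + 1
    lam = math.lcm(p - 1, q - 1)
    n2 = n * n
    # L(x) = (x - 1) // n; mu is the modular inverse of L(g^lam mod n^2)
    mu = pow((pow(g, lam, n2) - 1) // n, -1, n)
    return (n, g), (lam, mu, n)

def encrypt(pub, m: int) -> int:
    n, g = pub
    n2 = n * n
    r = random.randrange(1, n)           # random blinding factor
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c: int) -> int:
    lam, mu, n = priv
    n2 = n * n
    return ((pow(c, lam, n2) - 1) // n * mu) % n

pub, priv = keygen()
c1, c2 = encrypt(pub, 5), encrypt(pub, 7)
# Multiplying ciphertexts adds the plaintexts; the server never sees 5 or 7.
c_sum = (c1 * c2) % (pub[0] ** 2)
print(decrypt(priv, c_sum))  # 12
```

The server multiplies two ciphertexts and, without ever learning the inputs, produces an encryption of their sum. The encrypted lookups in Photos rest on the same principle, applied to image embeddings instead of integers.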

> This is what a good privacy story looks like.

This is what a coverup looks like. Good privacy stories never force third-party services on a user, period. When you see that many puppets on stage in one security theater, it's only natural for things to feel off.

> This is what a coverup looks like.

That’s starting to veer into unreasonable conspiracy-theory territory. There’s nothing to “cover up”: the feature has an off switch right in Settings and a public document explaining how it works. It should not be on by default, but that’s no reason to immediately assume bad faith. Even the author of the article is more concerned about bugs than intentions.

  • > the feature has an off switch right in the Settings and a public document explaining how it works

    Irrelevant.

    This is Apple's proprietary software, running on Apple's computers, devices which use cryptography to prevent you from inspecting it or running software they don't control. Very few people have any idea how it actually works or what it actually does.

    That there's some "public document" describing it is not evidence of anything.

    > that’s not a reason to immediately assume bad faith

    The mere existence of this setting is evidence of bad faith. The client-side scanning nonsense proved controversial despite their use of children as political weapons. That they went ahead and did this despite the controversy removes any possible innocence. It tells you straight up that they cannot be trusted.

    > Even the author of the article is concerned more about bugs than intentions.

    We'll draw our own conclusions.

  • Would you feel the same if Microsoft turned on Recall on all Windows PCs everywhere with an update?

    They worked very hard on security these past few months, so it should be all good, right?

    • That is not the point at all and you either didn’t try to understand one iota of it or are outright arguing in bad faith.

      I am not claiming for one moment that enabling this by default is OK. In fact, I have explicitly said it is not.

      What I am saying is that it is ignorant to call this a cover-up, because a cover-up requires subterfuge. This feature has a freaking Settings toggle and public documentation. Calling it a cover-up is the type of uneducated rhetoric that gets these issues brushed off by those in power as “it’s just a bunch of loony conspiracy theorists complaining”.


  • Sure it is. This isn't a feature or setting that users check often, if ever. And now their data is being sent without their permission or knowledge.

> This is what a coverup looks like

This is a dumb take. They literally own the entire stack Photos runs on.

If they really wanted to do a coverup we would never know about it.

  • Why wouldn't this mistake be addressed in a security hotfix, then? Apple has to pick a lane: this is either intended behavior enabled against users' wishes, or unintended behavior that compromises the security and privacy of iPhone owners.

It's not that binary. Nobody is forcing anything: you can choose not to buy a phone, or not to use the internet. Heck, you can even decline to install any updates!

What is happening is that people make tradeoffs and decide to what degree they trust whom and what they interact with. Plenty of people might just 'go with the flow', but putting what Apple did here in the same bucket as what, for example, Microsoft or Google does is a gross misrepresentation. Presenting it all as equal just kills the discussion and doesn't inform anyone.

When you want to take part in an interconnected network, you cannot do that on your own; you have to trust other parties to some degree. This includes things it might 'feel' like you can judge (like the browser you're using to access HN right here), but you actually can't, unless you understand the entire codebase of your OS and browser, all the firmware on the I/O paths, and the silicon it all runs on. So you make a choice, which, as you are reading this, is apparently that you trust this entire chain enough to take part in it.

It would be reasonable to make this optional (as in, opt-in), but the problem is that you end up asking the user a ton of "do you want this?" questions on almost every upgrade and install cycle, which is not what they want (we have had this since Mavericks and Vista; people were not happy). So if you can engineer a feature to be as privacy-centric yet automated as possible, it's a win for everyone.

  • > What is happening, is that people make tradeoffs, and decide to what degree they trust who and what they interact with.

    People aren't making tradeoffs - that's the problem. Apple is making the tradeoffs for them, and then retroactively asking their users "is this okay?"

    Users shouldn't need to buy a new phone to circumvent arbitrary restrictions on hardware that is their legal property. If America had functional consumer protections, Apple would have been reprimanded even harder than in its EU smackdowns.

    • People make plenty of tradeoffs. Most people trade most of their attention/time for things that are not related to thinking about technical details, legal issues or privacy concerns. None of this exists in their minds. Maybe the fact that they implicitly made this tradeoff isn't even something they are aware of.

      As for vectorised and noise-protected PCC, sure, they might have an opinion about that, but people are rarely informed enough to think about it, let alone to gain the insight needed to make a judgment about it at all.