This distinction of changing the de facto ownership of your device and data is the real inflection point. The surveillance technology itself is not really that novel: functionally, it applies established anti-virus techniques to data instead of code. Ask any AV company how their detection works, and the answer will include a variation on this.
This same tech can (and likely will) be used to find the owners of bitcoin and other cryptocurrency wallets, honeypot tokens, and community identities, and to provide profiling information to the company's political masters. The collisions in the hashing scheme mean that anyone can insert whatever they want onto people's devices and get the owners pulled into the legal system once it is flagged, where the process itself is the punishment. The whole scheme is too stupid to ever have been about reason; it's just pretexts and narrative, and this is as good a time as any to exit their ecosystem.
Apple really picked the wrong time to attempt this, as I do not see anyone who understands how evil this is ever forgiving them. The most charitable thing I can say is that they're probably doing it as part of a deal to avoid anti-trust plays: Apple plays ball with the feds and their allies, and the storm just magically passes over them. The good news is that running OS X made me lazy, and getting back into running a Linux or FreeBSD laptop again is going to be fun.
> The surveillance technology itself is not really that novel
What's novel is that the tech reports you to the authorities. Imagine your AV reporting you to the authorities for digital piracy; it's something the RIAA could only dream of back in the day. Now it's becoming a reality.
"We just scan every song on your iPod to make sure the neural hash isn't copyrighted content. If it is blah blah blah tokens blah blah report you to the RIAA."
Just bought a System76 laptop. Happy birthday Linux!
Not the authorities perhaps, but it is already happening to some degree: Windows Defender sends “samples” of unknown binaries to the cloud to analyze them. I am extremely bothered by this and try to disable this “feature” as much as possible. As always with Windows, you have to aggressively tweak system settings to permanently silence the constant reminders: “Oops, we have detected suboptimal settings, please turn on every privacy-invading feature for your convenience.” We can only hope MS doesn’t abuse the samples for other purposes, but given their and other Big Tech companies’ track records, we can assume there are additional parties interested in the submissions.
> I do not see anyone who understands how evil this is ever forgiving them.
I'm not so optimistic.
How many people among Apple's users actually understand how evil this is, and among those, how many do really, actually care? People seem fine enough with Facebook's data vacuuming, why would they protest against Apple's "non-intrusive" scheme? They "don't hate children" and, of course, "have nothing to hide".
The issue, as has been brought up in one form or another in the numerous threads on the subject, is that people like their comfort (using smartphones) and there really isn't that much of a choice.
> The good news is running OSX made me lazy, and getting back into running a linux or freebsd laptop again is going to be fun.
And therein lies the rub. Many people wouldn't find doing this fun. They'd much prefer being able to watch Netflix in ultra high-def and not having to futz around with Nvidia's drivers or what have you.
> Many people wouldn't find doing this fun. They'd much prefer being able to watch Netflix in ultra high-def and not having to futz around with Nvidia's drivers or what have you
I think they were being sarcastic, maybe not. Either way, you're right. And this is why WE need to be doing this so that it becomes a viable option.
I feel it's a bit unreasonable to expect a non-technical user to even start to comprehend this issue.
Many of them know that the photos, videos, and music from their old iPhone will be available on their new iPhone after they sign in; but do they really understand what happened in between to enable that? Should they even have to? That's what Apple is banking on.
It would be unrealistic to expect even technically equipped Apple fans to call out Apple's latest hypocrisy and move away from the ecosystem. They didn't do it earlier, they didn't do it when it came to light that Apple knew its contractors exploited child labor[1], and they won't do it now.
"get them pulled into the legal system once it is flagged, where the process itself is the punishment"
This is the real threat here. Anyone can have data flagged at any time, by accident or maliciously. Like how any video can be flagged for copyright infringement and the creator is 'punished by the process' regardless of guilt or innocence. A possible fix would be severe financial penalties for every false claim (let's say a million dollars per instance). Imagine how carefully the system would be designed if that were the case, versus the case where there is no punishment for false claims.
Just to extend my comment in response to how this problem will spread:
By doing surveillance on types of images, Apple is in effect implementing anti-virus - for ideas. That's only a bit hyperbolic, as the perceptual hash for a viral meme can be searched on, just like the material they're using as a pretext for it.
I could even see them announcing it at a launch. We should be concerned that the company has skipped its Black Mirror stage and jumped right into its Universal Paperclips endgame.
(I'm also appreciating the irony that people like me are angry about Apple announcing they're going to implement a version of what Google has already been technically able to do for the last decade, and what Microsoft has probably been doing in secret for even longer.)
> This distinction of changing the defacto ownership of your device and data is the real inflection point.
So the ability to store child porn is what constitutes "de facto ownership" in your mind?
But why would they "use this tech to hunt down bitcoin owners"? They could just scan emails or photos directly. Doing it by way of neural hashes and vouchers seems like an absurdly complicated detour when they already own the OS and all the most commonly used apps.
>So what happens when, in a few years at the latest, a politician points that out, and—in order to protect the children—bills are passed in the legislature to prohibit this "Disable" bypass, effectively compelling Apple to scan photos that aren’t backed up to iCloud? What happens when a party in India demands they start scanning for memes associated with a separatist movement? What happens when the UK demands they scan for a library of terrorist imagery? How long do we have left before the iPhone in your pocket begins quietly filing reports about encountering “extremist” political material, or about your presence at a "civil disturbance"? Or simply about your iPhone's possession of a video clip that contains, or maybe-or-maybe-not contains, a blurry image of a passer-by who resembles, according to an algorithm, "a person of interest"?
What I don't get is what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services, so the point at which the scanning is done is mostly arbitrary from a process standpoint (I understand people believe there are huge differences philosophically). They could have already scanned our files because they already have full control over the entire ecosystem. If they can be corrupted by authoritarian governments, shouldn't we assume they already have been? If so, why did we trust them with full control of the ecosystem?
Previously, in the San Bernardino shooter case for instance, Apple argued in court that creating backdoors or reversible encryption was insecure, subject to exploitation by malicious actors, and "unreasonably burdensome". They also argued that compelling them to write backdoors violated the First Amendment.
It was most likely a winning strategy; the FBI actively avoided getting a ruling on it and found a workaround instead.
What Apple is creating here is an avenue for the FBI/NSA/alphabet agencies to use a FISA warrant and an NSL to mandate hits on anything. The argument that it has to be pre-iCloud-upload, or subject to manual review, or gated on some arbitrary threshold, is just the marketing needed to get the public to accept it.
All of that can easily be ordered bypassed. So it can become: scan, single hit for X, report.
I'll take the downvotes, but if anything, someone more conspiracy-minded could easily take this as a warrant canary. Given the backlash Apple has faced and ignored, it doesn't make much business sense for them not to back off unless they are:
A) betting on it being a vocal minority that resorts to action (entirely possible, especially given the technical hurdles involved in getting to a suitable alternative), or
B) being pressured by governments now (also entirely possible given their history with the FBI and previous investigations).
> What Apple is creating here is an avenue for the FBI/NSA/alphabet agencies to use a FISA warrant and an NSL to mandate hits on anything. The argument that it has to be pre-iCloud-upload, or subject to manual review, or gated on some arbitrary threshold, is just the marketing needed to get the public to accept it.
Why would they make things even more complicated with limited access, when they can already access everything in the cloud? Let's leave out the argument about expanding the scan to the whole device; if that happens, people will really start discarding their phones.
It isn't a philosophical debate. It's about invading and controlling someone else's property. I can't shack up in your home and eat your food just because I feel like it. We're all doomed because digital natives have no concept of boundaries between something they own and something someone is renting to them, or letting them use for free, in exchange for data mining.
The concept of ownership you are asserting is but one of many historical principles of ownership. There are however, other concepts of ownership that conflict with what you are asserting.
I don't think there is a good-faith argument that Apple is invading or controlling anything you own. All that's happening is that you agree to run the algorithm in exchange for using iCloud Photos. That's just a contract: a mutual, voluntary exchange.
Apple is renting the phone to you for $1000 down and $0 a month (unless you actually are financing). Therefore, they are the landlord and, given notice, can change the property as they feel fit.
I agree we are all doomed, but I don’t agree it has much to do with being a digital native or not. My boomer grandparents, my gen X parents, and my millennial self are all affected by this. And gen Z (the first generation of digital natives), and whatever comes after, are not to blame for it. Reducing it to a generational thing is silly.
I think “we don’t have the machinery to do that” is an effective argument in the real world when someone asks you to do something. I’m not sure if it matters legally (lawyers sometimes use vague phrases like “reasonable effort”), but it definitely affects how strongly people will pressure you to do things, and how likely you are to acquiesce to that pressure.
The scope of the change Apple would need to make to scan your photos arbitrarily just got a lot smaller. The number of engineers who would need to be “in the know” to implement this change got smaller. The belief from governments that Apple has the option of doing this got stronger. The belief among Apple’s own management team that they can do this got stronger.
Because that door hasn’t been opened yet. “Scan every photo on users devices” or “scan for non-CSAM” are much easier requests once they’ve already started scanning on-device.
The door has been opened for quite some time. What do you think Spotlight is? It scans and indexes all your data.
What's prevented the government from saying "hey if you see Osama Bin Laden in a spotlight scan, you need to send us all that guys data."
The answer is, Apple can just say FU. And that's exactly what will happen here. In particular, the US DOJ needs to stay in Apple's good graces here and not be overly aggressive. If DOJ pulls any funny business, that's a pretty good reason for Apple to just say "OK, we're picking up our toys and going home. You get nothing now and we're turning on E2EE."
This entire argument is a non sequitur and comes up like clockwork every time this issue is discussed. It's the metaphorical equivalent of saying "well someone could've snuck in through the open window. Let's just assume they did and leave the doors open as well".
How about instead we push back against Apple further shifting the Overton window on how acceptable it is for companies to run intrusive services on hardware we own?
It’s not a non sequitur. The comment is engaging with a series of rhetorical questions that imagine a slippery slope by observing that very little has changed about the trust model between iPhone users and their devices. If you are convinced Apple is slipping, then it is worthwhile to be able to answer how their position today is different than it was last month. That is of course a different question than whether their position last month was acceptable, and maybe people are realizing it was not.
As a concrete example, if you think the proposal introduces new technical risks, then if Apple announces they made a mistake and will instead scan entirely on the server, you may be satisfied. However, I’d argue that since no new technical risk has been introduced, your conclusions should not change.
I’d argue that the incorrect characterization of Apple’s announcement as scanning all the files on your phone with no control has shifted the Overton window more than what was actually proposed. Politicians who are none the wiser probably believe that’s what Apple actually built, even though it’s not.
> What I don't get is what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services...
Simple: Money.
Their response to any such demands would be (and has been) "we don't have the capability to do what you're asking".
No judge is going to burden them to spend their own dime to build a massive new feature like this and deploy it to every phone out there to comply with a demand arising from a prosecutor of an individual case.
No government is going to pony up the money to reimburse them to do it (not even getting into the PR optics).
That leaves it happening only if 1) they decide to do it themselves, or 2) government(s) legislate they must.
So far #2 hasn't happened. Politicians had no basis of reference to point to and say "Your competitor(s) are doing that, you should too".
But now that #1 occurred, it will normalize this nonsense and pave the way for #2.
> Their response to any such demands would be (and has been) "we don't have the capability to do what you're asking".
> No judge is going to burden them to spend their own dime to build a massive new feature like this and deploy it to every phone out there to comply with a demand arising from a prosecutor of an individual case.
Government does not care one bit about how much it costs or whether it is even possible. They demand the data with an ultimatum: deliver it as requested by the deadline, or they send in their IT people to take it. Sorry (not) if it takes your whole company down while they plug their own servers into your datacenter to take your data.
Their response to such demands has not been that they are technically incapable of doing what’s requested. The demand from the FBI in the San Bernardino case was a very small change to passcode-retry constants, because the terrorist’s device did not have a Secure Element.
I think that quite a few engineers are too focused on the technical aspects of it, and specifically on all those "barriers to misuse" that Apple claims to have in place. But it'll be much easier to remove the barriers once the system as a whole is in place.
The reason we're focused on the state of it now is that we can switch at any time - especially if those barriers are shown to be ineffective or are removed at some point.
There is a fairly large difference, the first being that it would be massively damaging to Apple's brand if they started scanning people's phones without permission.
But now that they've built the system to scan things on-device, they can be compelled by a government to scan for other things, and Apple can throw up their hands and say they had no choice.
> They could have already scanned our files because they already have full control over the entire ecosystem.
Apple barely submits any CSAM reports[0]:
> According to NCMEC, I submitted 608 reports to NCMEC in 2019, and 523 reports in 2020. In those same years, Apple submitted 205 and 265 reports (respectively). It isn't that Apple doesn't receive more pictures than my service, or that they don't have more CP than I receive. Rather, it's that they don't seem to notice and therefore, don't report.
That's rational, but the point he's making is that this system obliterates the only defense we have had or could have against such activity: end-to-end encryption. This approach owns the endpoint.
…in the same way any existing feature of iOS that makes device data available to Apple (eg iCloud Backup) “owns” the endpoint, no? What’s to stop a malicious Apple from turning on iCloud Backup for all its users and hoovering up your Signal messages database and iCloud Keychain?
> What I don't get is what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services...
Yes, proprietary black-box hardware and software is poor from a user-privacy perspective. But if Apple had quietly begun on-device scanning of content, I'd imagine someone would eventually notice the suspicious activity and investigate.
With Apple's announcement, the scanning will just be something that Apple devices do. Nothing to worry about. And, no way for anyone to independently verify that the scope of the content being scanned has not been secretly increased.
As for iCloud, if your content is not encrypted on the device in a manner where only you have the keys, any cloud storage is suspect for scanning / data mining. But on-device scanning is a back door for E2E encryption: even on-device encryption with keys only you control is thwarted.
> no way for anyone to independently verify that the scope of the content being scanned has not been secretly increased.
This seems like the easiest thing out of the lot to verify.
The way that this system is designed to work is that when uploading to iCloud Photos, images have a safety voucher attached to them.
If Apple secretly expanded this to scan more than just iCloud Photos, they would have to either a) upload all the extra photos, b) add a new mechanism to upload just the vouchers, or c) upload “fake” photos to iCloud Photos with the extra vouchers attached.
None of these seem particularly easy to disguise.
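A toy model of that upload path makes the point concrete (the names and structure here are my assumptions for illustration, not Apple's actual PSI protocol): the voucher only exists as part of an iCloud Photos upload, so widening the scan means widening the uploads.

```python
# Toy model of the described upload path: a safety voucher is generated
# only as part of an iCloud Photos upload, so scanning extra photos
# would require extra (visible) uploads or a new upload channel.
# Purely illustrative; not Apple's actual protocol.

from dataclasses import dataclass

@dataclass
class Upload:
    photo: bytes
    voucher: bytes  # encrypted match metadata, opaque to the server

def hash_photo(photo: bytes) -> bytes:
    # stand-in for a perceptual hash such as NeuralHash
    return bytes([sum(photo) % 256])

def make_voucher(photo: bytes) -> bytes:
    # stand-in for the encrypted safety voucher
    return b"voucher:" + hash_photo(photo)

def upload_to_icloud_photos(photo: bytes) -> Upload:
    # The voucher exists only attached to an upload: there is no
    # separate "send voucher without photo" path in this design.
    return Upload(photo=photo, voucher=make_voucher(photo))

library = [b"beach", b"birthday", b"receipt"]
synced = [b"beach"]  # only photos the user syncs get vouchers

uploads = [upload_to_icloud_photos(p) for p in library if p in synced]
assert len(uploads) == 1  # unsynced photos produce no voucher at all
```

In this model, any secret expansion shows up as extra `Upload` traffic, which is the parent's point about it being hard to disguise.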
Your concern is completely understandable if you are starting from the premise that Apple are scanning photos then uploading matches. I think that’s how a lot of people are assuming this works, but that’s not correct. Apple designed the system in a very different way that is integrated into the iCloud upload process, and that design makes it difficult to expand the scope beyond iCloud Photos surreptitiously.
Could Apple build a system to secretly exfiltrate information from your phone? Of course. They could have done so since the first iPhone was released in 2007. But this design that they are actually using is an awful design if that’s what they wanted to do. All of their efforts on this seem to be pointed in the exact opposite direction.
This is a very well-written post. Ever since this program has been announced I have struggled with talking about the implications succinctly.
Online, I never know if an interlocutor is even arguing in good faith. Even in person, it's difficult to balance the need to be concise against explaining all the ways the claimed safeguards are meaningless, how the benefits don't really make sense, how this is markedly different from other infringements on privacy, and why the problems aren't just theoretical, because similar invasions of privacy are already killing actual people around the world.
Anyway, I think the only practical way this could resolve well is if Apple saw a precipitous decline in its iCloud brand; then it could be argued that they had to abandon the plan for purely business reasons. A serious movement to abandon Apple services ($17.5B revenue in 2021 Q3) might empower the people within Apple who opposed this reckless plan from the beginning.
Australia has already shown what the end-game is, with its "The Assistance and Access Act 2018" [1]. It's not illegal to have end-to-end encryption, but it's illegal to deny access to the ends of the encrypted pipe.
As an aside, Australia has just implemented the next step: the "Surveillance Legislation Amendment (Identify and Disrupt) Bill 2021" [2], which makes it legal to hack your device to access the ends of the pipe. Useful if the ends of the pipe are not controlled by a malleable corporation.
This actually sounds like the right way to go. Individual, warrant based access is comparable to wiretapping in a way that Apple's dragnet approach is not.
The media drastically overreacted to that act, to the point where the Department of Home Affairs now has an entire page dedicated to addressing the false reporting [0].
The TL;DR is that the act doesn't allow the government to introduce mass surveillance. Section 317ZG [1] expressly forbids any law enforcement request from _having the effect_ of introducing any systemic vulnerability or weakness and _explicitly_ calls out new decryption capabilities as under that umbrella. Your claim that a company can't deny access to the ends of an e2e-encrypted pipe is false.
And yes, that new act exists. The government will be able to hack into your devices and take over your accounts _with a warrant_, just like they can break into your house or take money from your bank account _with a warrant_.
> Apple devices might not be precisely the smartest purchase if the concept of your hardware is important to you.
Maybe a handful of HN users are aware of that, but the majority of users think that their property belongs to them.
It also goes against what Apple marketing says about privacy and your data. I wouldn't fault most consumers for not understanding that Apple's PR doesn't reflect reality.
I have a modest suggestion in the spirit of Apple's move.
As we know, there are people in the world who are running meth labs or creating explosives for terrorists in their homes. In order to safeguard the public, we shall have a detachment of dogs which will sniff everyone's houses every once in a while. When they sense something bad they'll alert their handlers and there'll be a manual inspection before reporting to police.
There's no risk to privacy here - dogs being dogs can't tell their handlers what they sense. We can also show the training publicly so people can verify the iDogs are trained to only sense drugs or explosives. So it's all even more secure than Apple's iPhone scanning! What say you?
I understand you're making a reductio ad absurdum argument here, but this is actually very similar to what LEO often tries to do today (e.g. searches based on what is smelled / seen inside your car at a traffic stop) and actually iDog might be constitutional.
The constitutional standard for a warrant search is "probable cause", and for a warrantless search you generally also need exigent circumstances. Assuming a judge is sufficiently satisfied with the iDog's nose, and the iDog was sniffing somewhere public like the sidewalk when it found the meth smell, you could likely establish both probable cause (iDog smells meth) and exigent circumstances (meth labs often blow up, meaning there's imminent danger that cannot wait for a warrant).
That's not to excuse Apple, just to provide a fun backstory on the things law enforcement gets to do in this country.
Another one that was nearly deemed constitutional: in Kyllo v United States, LEOs used thermal imaging to find an Oregon man's house was radiating a high amount of heat indicative of intense grow lights, which they used as probable cause to search the home for an illegal pot growing operation. This was only found unconstitutional by a 5-4 decision in the supreme court. If it were found constitutional, you can imagine we'd have helicopters flying overhead thermal imaging for pot operations today.
That said, I do feel you miss the genius of the iDog proposal. As far as I understand, an officer might sometimes be able to use his dog's nose if it happens during a procedure (which might include the dog searching if there's a warrant), but he can't create the circumstances deliberately. 'I was doing something proper and then the dog started jumping' might be admissible, but if an officer started walking the dog around hoping to catch people opinion might be different.
We suggest regularly scanning every household in the nation in a deliberate process. I was just proofreading five different papers proving the system is perfect if we can trust the dogs (of course we can, only monsters and terrorists don't trust dogs).
Don't worry, these are well-trained, well-bred and very well-fed iDogs. 'Not eating when not fed by the handler' is part of the basic training. Also, there's a manual verification step where the handlers search your property before reporting to the police.
The chance of an error is less than one in a billion. It's worth it to beat the drug dealers and terrorists.
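Taking a "one in a billion" claim at face value, a quick back-of-envelope calculation (all numbers hypothetical, chosen only for illustration) shows why a tiny per-item error rate still matters at population scale:

```python
# Back-of-envelope: even a "one in a billion" per-photo false match
# rate produces a steady stream of false flags at smartphone scale.
# All numbers here are hypothetical, for illustration only.

p = 1e-9                  # assumed per-photo false match probability
photos_per_user = 10_000  # assumed library size
users = 1_000_000_000     # assumed device population

# Probability a given user gets at least one false match:
per_user = 1 - (1 - p) ** photos_per_user   # ~1e-5

expected_flagged_users = users * per_user
print(round(expected_flagged_users))        # ~10,000 users flagged
```

The per-check odds look reassuring; multiplied across billions of devices and thousands of photos each, they are not.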
The article mentions the slippery slope and the "what happens in a year or two when..." scenarios; it even calls it a cliff. But it doesn't expand on the timelines of concern.
As it currently stands, this concept would be sitting in plain sight waiting eternally for any lawmaker anywhere.
In your country, either side of the political spectrum, given a lawmaking majority, can simply tap Apple on the shoulder and potentially turn ALL those devices against you.
Guns won't help when information technology is used against you, when you are separated from society in a manner where people dare not risk their own livelihood for fear of being similarly marked.
And if Apple goes ahead with this, this risk is sitting there for the rest of your life just waiting for [that one politician that represents everything you hate] to use it against you.
Maybe that politician hasn't been born yet. But they will come. Don't let this Pandora's Box sit waiting for them.
It's the same reason free speech is something akin to sacred even for your worst enemies, because those who start taking away the bad people's speech are themselves always going to be one political actor away from having theirs taken away.
For the privacy-minded Apple users among us (I mean, that's who they marketed to, yeah?), I'd recommend turning off automatic software updates... For as long as it makes sense to. I hope they reverse their decision, but I'm already looking for alternatives. I'm certainly not buying another Apple device, even though I'm about due.
They really lost a lot of fans with this, myself included.
In the essay the "i" in the headline is lower case, which is significant and chilling. It's a homunculus of Apple's new direction: the meaning changed from "me" to "panopticon".
I don't know... I have a really hard time getting too upset about this. I'm a big proponent of privacy and have always been a Snowden supporter. And while "protecting children" is a trope in politics, I think everyone with an iPhone knows they're giving up some privacy to own one. It's constantly tracking their location and sending other data to Apple.
This isn't a government agency. Apple has been incredibly thoughtful about privacy in the past, and I feel like they've earned the benefit of the doubt here.
I hope I'm not wrong, but I don't see how this is insane. They're just making sure the files you upload to them aren't illegal.
Maybe I'm completely paranoid here, but given that actual sex offenders commonly seek out ways to be near children, what happens if one or more of them end up in Apple's image vetting team?
They'd be completely anonymous and fully covered, with an endless pipeline of naked kids images being delivered to them.
The idea that if you take a picture of your kid in the bath, it just happens to match a CSAM fingerprint and then gets silently transmitted to anonymous reviewers for "review" is terrifying.
This is a disgusting thought, but hear me out. Perhaps this might actually be a good job to give to a paedophile. Their classifications would probably have a lower false positive rate than those of someone who is disgusted by the images, and it would all but eliminate any concern about an employee suffering psychological trauma.
Your terrifying idea mischaracterizes the nature of false positives. Any photo in your library is as liable to be a false positive as any other; the perceptual hash is not looking for images that are similar by the metric you care about (content). That's also the underlying idea behind why people have been able to turn arbitrary images into adversarial false positives.
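To make that concrete, here is a minimal average-hash (aHash) sketch. This is a classic perceptual hash, not Apple's NeuralHash, and the 8x8 "images" are fabricated for illustration, but it shows why hash collisions have nothing to do with perceived content:

```python
# Minimal average-hash (aHash): a simple perceptual hash. NOT Apple's
# NeuralHash, but it illustrates the same property: the hash encodes
# coarse structure, not what a human would call similar content.

def average_hash(pixels):
    """Hash an 8x8 grayscale image (list of 64 ints in 0..255).

    Each bit records whether a pixel is brighter than the mean, so
    many visually different images map to the same 64-bit hash.
    """
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Two "images" with very different pixel values that share a hash:
# any image with the same brighter-than-mean pattern collides.
img_a = [200] * 32 + [50] * 32   # bright top half, dark bottom half
img_b = [255] * 32 + [0] * 32    # harsher contrast, same pattern
assert average_hash(img_a) == average_hash(img_b)
```

NeuralHash is far more sophisticated, but the same gap between hash similarity and human-perceived similarity is what adversarial collisions exploit.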
"just making sure the files you upload to them aren't illegal"
the problem lies in this sentence. 1) this happens on-device before the files are uploaded, which is a monumental shift for a company that claims to be pro-privacy; 2) they're now saying they're willing to surveil photos on behalf of governments, and the stated reason is sort of irrelevant. they're opening pandora's box - are they going to start scanning files on behalf of the RIAA or other copyright holders now?
I'm curious could you unpack what this means for me? "big proponent of privacy and have always been a Snowden supporter"
Given that, my assumption is this would click for you, but as you said it doesn't. What does being a big proponent mean to you? How do you support Snowden? What's important to you about privacy?
Curious to hear your logic, I bet there are tons of people who have the same concerns (or lack of).
I stopped work on a memo app. Was piggybacking on Apple's branding around privacy. Am going to wait a year to see how this shakes out. Super disappointed.
IMHO this is really another continuation of the "you will own nothing and be happy" trend that has been around for a while, but companies have started to really push in the last few years. Slowly eroding ownership and normalising mass surveillance is their goal, so they can continue to extract more $$$ out of you.
When Apple sells yet another record number of iPhones next quarter, which device should we move to?
Fundamentally this illustrates that software has become too inherently intrusive. What's the solution, though, that could ever be mainstream?
The other issue is that software has become too complicated and too many (potentially) bad things are happening in the background. How can the layperson fight back?
GrapheneOS on a Pixel 4 has been a dream, and elementary OS on an XPS 13 is similarly great.
The WebUSB installer for GrapheneOS is a game changer; it made the process incredibly easy. Also, most apps seem to work fine without Play services.
Synology Photos is a great local iCloud Photos replacement.
I made the switch this past weekend. Aside from the impact on my wallet and hassle of needing to sell my Apple equipment, it was surprisingly painless.
I think this kind of defeatism really feeds the public's lax attitude toward privacy.
Yes, the iPhone and Apple products are very popular. And they will probably continue to grow. Does that mean we just accept anything they do, antithetical to one of their core promises to their customers?
Or do we make a big deal about it so everyone sees what's happening and what the implications are?
Do you mean that both the state and corporations will make you a generous gift of privacy? Right when it goes against their best interest of grabbing more power and profit?
Nope.
If you want privacy, like any other rights, you will need to fight for them. Rights can only be gained by a fight; whatever is given is a privilege, which is often taken back as easily as is granted.
So be prepared to (continue to) fight for your rights: in courts, in Congress, etc, but also by choosing less convenient, less featureful, more expensive devices and software which does not violate the rights you care about. And no, the majority of the consumers won't care until you show some signs of winning.
i actually had an ipad picked out and in the basket ready to buy it! although i have been thinking of buying one for a good few months now, i've just been hesitating a lot because i was not sure if i could handle how restrictive ios is compared to android.
the thing that really has me torn is that there is nothing else like it for simple/fun/creative music making apps (samplr, reason compact etc), which is the main reason i was going to buy one. at the same time i don't want to be part of apple's figures next quarter
Simply not using a smartphone is fine, they aren't that great and there doesn't need to be an alternative.
it feels like we've been trained as consumers to the point where saying no isn't realistic anymore; there must always be something else to buy that represents me better, etc.
I think iPhone is superior and would hate to leave it. Although I barely do much with my phone outside 2FA, browsing, and texting. A switch won't be too bad in that regard.
I do love my Macbooks though. So while these privacy invasions make me angry I'm not willing to drop Apple all together. I've been meaning to keep most of my sensitive information on a usually disconnected Linux machine anyways. I'll keep using my Macbook for development.
I see this Apple move as a warning.
I have lived part of my life under a communist regime.
For me my Apple addiction ends here. There is no "magic" left in their products, only "bait & switch" dark patterns.
No hardware or UX will lure me again to suppress my instincts.
This is the beginning of a global, politically and financially motivated race for public control.
Apple is just giving a spark to the fire. Imagine a future in which your beloved Face ID is tied to everything, and your beloved iDevices, Teslas, or home appliances are scanning and reporting, scanning and reporting. There is no middle ground in this for me. No benefits or conveniences are that important. FOSS and public oversight of software must be demanded by law.
I don’t understand why this outrage seems so US-centric. This is the same Apple that hands over all your iCloud data (photos and otherwise) to the CCP if you happen to live in China. And they’ve done this openly for the last several years.
What am I missing? Isn’t that a much much much much worse thing for Apple to do? Why are we only suddenly suspicious of Apple’s privacy claims with this matter?
As an American I couldn’t care less that Chinese people’s data is handed over to Chinese people’s government, especially considering the alternative would be that Chinese people’s data is handed over to a US entity and by association the US government.
Contrary to popular belief, iCloud data, while encrypted, can be decrypted by Apple and is subject to US law enforcement requests. *
Considering this fact, it is pretty one-sided to see this as some sort of inconceivable act. However, if you look at it from the other side: would we want all American user data (assuming Russia had a company with as pervasive a penetration into American lives as Apple has globally) to be sitting on Russian servers, subject to arbitrary Russian laws?
So if you only consider American interests, it's inconceivable for us to give up such power and control over other sovereign nations, but perhaps other countries don't care about American interests like we do.
All Apple did in China was comply with local laws to stay in business there. What Apple is doing in the US is not mandated by law (as far as I know).
From the American side Apple has marketed itself as privacy focused, even fighting the FBI publicly at the risk of negative publicity. This about face is unexpected but also betrays those of us who invested in the Apple product line under the expectation they continue this standard of privacy and security that was marketed. Chinese people probably never expected this level of privacy to begin with, but we did and we can.
This to me is a surprising attitude. As an American whose outlook is generally framed by American values, it’s very upsetting to think of how privacy and freedoms are systematically impinged upon in so many parts of the world. If we can be upset about invasion of privacy in one country, why would those principles change at geopolitical borders?
> Chinese people probably never expected this level of privacy to begin with, but we did and we can.
Taiwanese users' data is also backed up to China. This was confirmed to me by an Apple Support representative in China. Whether you believe Taiwan is part of China or not, I can assure you users in Taiwan do expect this level of privacy.
What we are seeing is a slow deterioration of user privacy across the Apple ecosystem, not just the US. So even if you don’t care about Chinese users data, it does show what Apple management as a whole thinks of your data.
> Contrary to popular belief, iCloud data, while encrypted, can be decrypted by Apple and is subject to US law enforcement requests.
Most seem to forget that with this upcoming feature, that is no longer possible. Apple can't decrypt your images on request anymore. (Read up on Apple's PSI system.)
There is also strong evidence that the same is coming for backups: on the iOS 15 beta, there is a backup recovery option via authentication key.
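For intuition about what a PSI (private set intersection) protocol buys you, here is a toy Diffie-Hellman-style sketch in Python. This is not Apple's actual construction (theirs layers threshold secret sharing and safety vouchers on top); the modulus, set contents, and names are all illustrative:

```python
# Toy DH-style private set intersection: two parties learn which
# hashed items they share without revealing the rest of their sets.
import hashlib
import secrets

P = 2**127 - 1  # toy prime modulus; real systems use elliptic curves

def h(item):
    """Hash a string into the group (toy random-oracle stand-in)."""
    return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P

# Each party picks a secret exponent.
a = secrets.randbelow(P - 2) + 1  # server's secret
b = secrets.randbelow(P - 2) + 1  # client's secret

server_set = {"hash1", "hash2", "hash3"}
client_set = {"hash3", "hash4"}

# Server sends H(x)^a; client sends H(y)^b; each re-raises the other's
# values by its own secret. x and y match iff H(x)^(ab) == H(y)^(ba).
server_blinded = {pow(h(x), a, P) for x in server_set}
client_blinded = {pow(h(y), b, P) for y in client_set}
server_double = {pow(v, b, P) for v in server_blinded}   # done by client
client_double = {pow(v, a, P) for v in client_blinded}   # done by server

shared = server_double & client_double
assert len(shared) == 1  # exactly one common item ("hash3")
```

Neither side ever sees the other's raw hashes, only blinded group elements, which is the property the backup-encryption argument above relies on.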
Apple didn’t wake up one day and decide to do this on a lark.
They’re being proactive, probably in a minimalist form, to anticipate regulatory powers on what is unarguably the largest or second largest platform used for illegal porn.
If FB screeners have ptsd and are killing themselves over what they have to see every day, imagine what is on iCloud and iPhones. Right now, nobody is required to filter that content while social media is. The alternative to “sure, you tell us what is illegal and we’ll scan for it” is “We’re the govt and we want to see everyones photos for the children.”
Sure, the latter may still happen, but probably later than sooner now. I'm surprised it has taken this long.
It's my understanding that Apple simply can't operate in China without playing by those rules. So really, the onus is on the CCP.
In this case, is Apple being compelled to do this by the US government? Or is it a choice Apple has made purely internally? I think that makes a difference.
I agree the question of whether Apple was compelled by whatever government (or if they did this voluntarily) has implications on the ethics of these decisions. They may genuinely have no choice.
But I don’t see how it affects the question of whether Apple’s privacy assertions are trustworthy.
> is Apple being compelled to do this by the US government? Or is it a choice Apple has made purely internally? I think that makes a difference.
You're being downvoted but it's a critical issue.
If Apple is currently being compelled to do this, it likely means the US Government has a massive new privacy obliterating program underway and Apple probably isn't the only tech giant joining the human rights violation parade. It's important to find out if that's going on. We can be certain they didn't stop with PRISM.
If it turns out to be the case, that Apple has joined up to another vast human rights violating program (they already did it at least once before, remember), the US needs to move forward toward Nuremberg-style trials for all involved Apple management and all involved Apple employees (and not only them). That's the only way it stops.
Such human rights violations should not be allowed to continue. How many tech employees at these companies got away with extraordinary human rights violations related to PRISM? Employees at these companies were responsible in part and critical to helping to make it happen. Who are these enablers? Why aren't they in prison? Why is this so rarely discussed on HN? (yeah we all know why)
HN is pretty amusing about this topic. Privacy is a human right? Yeah? Also universally HN: but let's not talk about the people actually responsible for the human rights violations; let's not talk about all the techies being paid princely sums to commit human rights atrocities. Let's not talk about prison sentences for what they've done to their fellow humans. Let's not hold tech employees responsible.
I imagine because most of us care more about US policies in general than Chinese because most of us live in the US. If fixing Chinese lack of free communication were on the table I'm sure we'd mostly be for it, but that's a whole other thing that ultimately goes back to their government.
I mean, that’s also bad? But the CCP is not going to budge on this, and it doesn’t affect me as much as an American, so I feel like I can be upset about both and more upset about the one that affects me directly.
As far as is documented, the behavior of iCloud does not change, just the operator. In particular, the difference is that end to end encrypted data in iCloud remains that way, so saying all iCloud data is handed over is incorrect.
In fact, iMessage is the only end to end encrypted messaging service operating in the country (for example).
It’s my understanding that the keys used in that “end-to-end” encryption are also under the control of the operator [1], so from a privacy perspective it is the same as handing over that data in plaintext.
> the fact that, in just a few weeks, Apple plans to erase the boundary dividing which devices work for you, and which devices work for them.
A very overdramatic sentence.
It is a bit scary to realise that only now do people think this border is being crossed. It was crossed a very long time ago.
In the first years of Android, the owners were the product, not the phone. Privacy features in recent years might have improved this a little.
Google's massive success across many services is built on how phones and their software were collecting data for them. User interfaces are just illusions for non-technical people; they might give you a sense of control.
Now that Apple does not trust us with CSAM material, the end is near. There are arguments for both sides, and many are taking sides to just get attention.
However, you can only solve this problem with politics.
Privacy advocates need to be like second amendment activists. We need to use their playbook. They raise a big stink about anything, no matter how big or small, that could curtail their rights. No number of Sandy Hook events will result in meaningful changes in laws.
Pushing everyone to Linux will eventually lead to all hardware falling under some national security law, with hardware allowed to be imported only if it restricts which OSs can be installed and ships with locked boot loaders.
Free market has no impact here, the masses don't care. And privacy supporters are too logical to whip up any type of movement.
Till privacy advocates come up with emotional reasons why privacy is absolutely necessary (like grandma is gonna die without it), this is a losing battle.
> Till privacy advocates come up with emotional reasons why privacy is absolutely necessary (like grandma is gonna die without it), this is a losing battle.
They have already come up with good reasons.
"Every time you use encryption, you're protecting someone who needs to use it to stay alive." -- Bruce Schneier
"Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say." -- Edward Snowden
Surveillance harms journalism and activism, making the government too powerful and unaccountable. If only activists and journalists try to have privacy, it will be much easier to target them. Everyone should have privacy to protect them. It's like freedom of speech: necessary not just for journalists but for everyone, even if you have nothing to say.
> Pushing everyone to Linux will eventually lead to all hardware falling under some national security law
I don't see any connection here. Linux is already used on all servers and nothing happens.
I never thought the day would come when Apple news would so strongly bring to mind, as a knee-jerk reaction, one short part of a long one-sentence Hungarian poem, fittingly called One Sentence On Tyranny. It is a poem many of us know by heart, to remember what once was, even if many back in Hungary forgot. I fail to convey my emotions here in just a few words, but I am incredibly saddened.
Anyways, I checked a few translations, they lose some of the power of the original but let me try, the first two lines only for context:
[...]
you would like to look, but you can only see
what Tyranny conjured up for you
already forest fire surrounds you
fanned into flame from a matchstick
you threw down without stamping it out
Oh yes, nothing new under the Sun. And it might be too late now.
These things have been on my mind so much because I saw an anti-masker protest in Vancouver peacefully escorted by police. My mind melted. I remember, all too well, that it was only 35 years ago when Hungarian police broke up a protest with batons; it's called the Battle of the Elizabeth Bridge to this day. It was a very one-sided battle, mind you. And while I don't remember, my parents do, when they broke one up with tanks...
I wonder if any current or future President might revisit the idea of granting Snowden a pardon. He is still viewed unfavorably by a fair amount of the US population, but it seems like that's changing with time.
Around here, people view him favorably but are wary because his choices make him look like an agent for Russia. I think it would change only after a pardon.
So what happens when, in a few years at the latest, a politician points that out, and—in order to protect the children—bills are passed in the legislature to prohibit this "Disable" bypass, effectively compelling Apple to scan photos that aren’t backed up to iCloud?
Isn't this essentially what might happen with so-called ChatControl in Europe?
My question after reading this article is, what can I do as a current iPhone user? Is there a reasonable alternative phone out there that can run the same apps but that isn't effectively controlled by Apple or Google?
Simple: don't store your photos on Apple's property (iCloud). That's the only time this fingerprinting will affect you. If you store photos locally, then no type of fingerprinting will happen.
While you're at it, don't store them on anyone's servers, because they all fingerprint for the same exact reasons. There's no service out there that doesn't do this at some level.
Sorry, I should have been more clear in my post. I meant to say that, if I no longer want to support Apple (or Google) because of their behavior, are there any reasonable smartphone options available? It seems to me that this is the only way I can protest against what Apple is doing.
What would happen if in a new paradigm shift multi-nationals decided to use their enormous lobbying power to push back on the government in favor of their users for once instead of only lobbying to screw them over?
Great to see someone influential framing it this way. At the end of the day, what it's there for doesn't matter and unfortunately way too many of us stumble on this mistake in reasoning.
I think your best bet might be NextCloud. You can either host it yourself on a storage VPS or find a provider who provides managed NextCloud (or however they repackage it.) Be aware there are many more providers than listed on the NextCloud website and you'd do well to search about. If you're in Europe, take a look at Hetzner's "Storage Share" as an example of what's possible and prices for high-quality hosting, but like I said there's a million operators out there doing this and you can get it dirt cheap if you want.
I don't know how it works on an iPhone, but on Android, NextCloud detects when some app saves pictures in a different directory and asks me if I want it to track that directory. It was already configured by default to track photos taken with the default camera app.
Synology's DiskStation software does this, but that's a significant investment. If you open the DS file app, it defaults to syncing your photos. I don't run mine publicly, so I've never checked how their VPN/cloud sync thing works.
If I want to share a photo from my phone I use syncthing hosted on one of my VMs on a server I bought and built, but that I don't have physical access to easily (I'll never see it, probably). At home to share a photo I either use mattermost to get a public link to the stored image or ssh to the same box as syncthing runs on. I also host mattermost, on a different VM on a different server in the same datacenter.
I don't like apps seeing my stuff so I just don't use stuff like imgur or whatever.
Edward Snowden just went full Stallman. Take a good idea and ride it over the cliff of sanity.
Of course privacy can't be absolute, we live in a society. Be realistic. Focus on evil things. If you think Apple have an evil plan, sure, but most people who object don't even think that.
And for a tech spy he seems to not understand tech. "What if some evil regime wants Apple to scan for anti-government propaganda". Well then they won't be using the CSAM system that's for sure, they can just scan the images directly, either on iCloud or on device. Co-opting the CSAM scanner is probably the most impractical way imaginable to spy on Uighur separatists.
I'm not saying you should "trust" me, or anyone. Just consider the facts:
A)
- Apple have complete control over the hardware, the OS and all the most popular apps, including Photos and Mail.
- They also have complete control over iCloud, which is not encrypted
- They can and do scan your photos and emails so that they can classify photos, find possible appointments, emails etc, and now they even OCR your photos.
B)
- They are now building a very high profile, limited and locked in system that relies on hashes, external databases, a large number of matches, human review etc.
Do you really think they would use B if they, the FBI, the Chinese government or whoever, would want to spy on users? For all we know they are already spying, it would be completely trivial to do so. Clearly system B is a complete red herring when it comes to spying. They don't need it.
IBM gleefully cooperated with the Nazis. It won’t be long until Apple is using this framework to alert the PRC about Chinese dissidents. Just to merely stay in their market.
iOS 15 effectively is the point where Apple kicks off the holocaust that they’re going to be responsible for. I hope they enjoy their place in history because they’re earning it. I’ve loved my iPhones, but there’s a warm place in hell waiting for all Apple employees involved in this endeavor.
Apple is an enemy more threatening to mankind’s freedoms than Al Qaeda ever could’ve been.
How did we get to a place (speaking of America, where there is a Bill of Rights) where it is normal to have a multimodal tracker on or near your person at all times?
George Orwell's telescreen at least could not fit in your pocket, and Orwell never imagined things like GPS, Facebook, or digital phones.
We are the slowly boiled frog as the most expansive totalitarian infrastructure in history is built up.
> How did we get to a place (speaking of America, where there is a Bill of Rights) where it is normal to have a multimodal tracker on or near your person at all times?
> George Orwell's telescreen at least could not fit in your pocket, and Orwell never imagined things like GPS, Facebook, or digital phones.
Because we're not living in 1984's dystopia where the government oppresses us, we're living in Brave New World's dystopia where we choose to oppress ourselves.
Apple has, through side channels, let it leak that iCloud is the largest open host of CSAM among big tech. It's the only large provider hosting images that doesn't automatically scan them. The only difference is that Apple wants to do the scanning while leaving your photos in the cloud encrypted. This isn't rational; it's an anti-Apple culture-war position.
This entire argument is based on the premise that Apple only scans photos that the user has requested to be uploaded to iCloud, and will continue to do so.
I don't think many people believe that anymore. Not necessarily out of doubt about Apple's goodwill, but out of doubt that, once the system is in place and normalized, governments won't mandate it and extend its scope by legislative fiat.
Are the photos in iCloud actually encrypted, though? As far as I’m aware, a government agency can subpoena them already. I’m not sure I agree with the slippery-slope argument, but I’m still failing to see how Apple’s current security model prevents them from performing the hashing on iCloud servers and avoiding all this drama.
This distinction of changing the defacto ownership of your device and data is the real inflection point. The surveillance technology itself is not really that novel, as functionally it's applying established anti-virus techniques to data instead of code. Ask any AV company how their detection works, and it will include a variation on this.
This same tech can (and likely will) be used to find the owners of bitcoin and other cryptocurrency wallets, honeypot tokens, and community identities, and to provide profiling information to the company's political masters. The collisions in the hashing scheme mean that you can insert anything you want onto people's devices and get their owners pulled into the legal system once it is flagged, where the process itself is the punishment. The whole scheme is too stupid to ever have been about reason; it's just pretexts and narrative, and this is as good a time as any to exit their ecosystem.
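To make the collision worry concrete, here is a toy perceptual hash in Python (an 8x8 "average hash"; Apple's NeuralHash is a neural embedding, but the structural point is the same: the mapping is lossy, so visually different inputs can share a hash). The pixel values are made up for illustration:

```python
# Toy "perceptual hash": a 64-bit average hash over an 8x8 grayscale
# thumbnail. Real systems use a neural embedding, but the collision
# concern is identical: the hash space is small and the mapping is
# lossy, so distinct images can hash identically.

def average_hash(pixels):
    """pixels: 64 grayscale values (an 8x8 thumbnail, row-major)."""
    assert len(pixels) == 64
    mean = sum(pixels) / 64
    bits = 0
    for p in pixels:
        # One bit per pixel: is it brighter than the image's own mean?
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Two clearly different "images" that collide: any images whose pixels
# fall on the same side of their own mean, bit for bit, hash the same.
img_a = [10] * 32 + [200] * 32  # hard black/white split
img_b = [90] * 32 + [110] * 32  # faint gradient, visually distinct
assert average_hash(img_a) == average_hash(img_b)  # a collision
```

An attacker who can construct such collisions against the target database can plant an innocuous-looking file that nonetheless trips the flagging threshold, which is exactly the "process is the punishment" scenario.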
Apple really picked the wrong time to attempt this, as I do not see anyone who understands how evil this is ever forgiving them. The most charitable thing I can say is that they're probably doing it as part of a deal to avoid antitrust plays: Apple plays ball with the feds and their parties, and the storm just magically passes them over. The good news is running OS X made me lazy, and getting back into running a Linux or FreeBSD laptop again is going to be fun.
> The surveillance technology itself is not really that novel
What's novel is that the tech reports you to the authorities. Imagine your AV reporting you to the authorities for digital piracy; it's something the RIAA could only dream of back in the day. Now it's becoming a reality.
"We just scan every song on your iPod to make sure the neural hash isn't copyrighted content. If it is blah blah blah tokens blah blah report you to the RIAA."
Just bought a System76 laptop. Happy birthday Linux!
Not the authorities perhaps, but it is already happening to some degree: Windows Defender sends “samples” of unknown binaries to the cloud to analyze them. I am extremely bothered by this and try to disable this “feature” as much as possible. As always with Windows, you have to aggressively tweak system settings to permanently disable the constant reminders: “Oops, we have detected suboptimal settings, please turn on every privacy-invading feature for your convenience”. We can only hope MS doesn’t abuse the samples for other purposes, but given their and other big techs’ track record, we can assume there are additional parties interested in the submissions.
> I do not see anyone who understands how evil this is ever forgiving them.
I'm not so optimistic.
How many people among Apple's users actually understand how evil this is, and among those, how many do really, actually care? People seem fine enough with Facebook's data vacuuming, why would they protest against Apple's "non-intrusive" scheme? They "don't hate children" and, of course, "have nothing to hide".
The issue, as has been brought up in one form or another in the numerous threads on the subject, is that people like their comfort (using smartphones) and there really isn't that much of a choice.
> The good news is running OS X made me lazy, and getting back into running a Linux or FreeBSD laptop again is going to be fun.
And therein lies the rub. Many people wouldn't find doing this fun. They'd much prefer being able to watch Netflix in ultra high-def and not having to futz around with Nvidia's drivers or what have you.
> Many people wouldn't find doing this fun. They'd much prefer being able to watch Netflix in ultra high-def and not having to futz around with Nvidia's drivers or what have you
I think they were being sarcastic, maybe not. Either way, you're right. And this is why WE need to be doing this so that it becomes a viable option.
I feel it's a bit unreasonable to expect a non-technical user to even start to comprehend this issue.
Many of them know that the photos, videos, and music from their old iPhone will be available on their new iPhone after they sign in; but do they really understand what happened in between to enable that? Should they even know? That's what Apple is banking on.
It wouldn't be pragmatic to expect even technically equipped Apple fans to call out Apple's latest hypocrisy and move away from the ecosystem. They didn't do it earlier. They didn't do it when it came to light that Apple knew its contractors exploited child labor[1]. They won't do it now.
[1] https://www.businessinsider.in/tech/news/apple-knew-a-suppli...
Well, you could categorise them as not understanding how evil this is. They are not comfortable with it, but they may not see it as evil.
"get them pulled into the legal system once it is flagged, where the process itself is the punishment"
This is the real threat here. Anyone can have data flagged at any time, by accident or maliciously, like how any video can be flagged for copyright infringement and the creator is 'punished by the process' regardless of guilt or innocence. A possible fix would be severe financial penalties for every false claim (let's say a million bucks per instance). Imagine how carefully the system would be designed if that were the case, versus the case where there is no punishment for false claims.
Supporting and advocating for “Right to Repair” laws was never more crucial, because this is not going to stop at iPhone or mac.
Just to extend my comment in response to how this problem will spread:
By doing surveillance on types of images, Apple is in effect implementing anti-virus - for ideas. That's only a bit hyperbolic, as the perceptual hash for a viral meme can be searched on, just like the material they're using as a pretext for it.
I could even see them announcing it at a launch. We should be concerned that the company has skipped its Black Mirror stage and jumped right into its Universal Paperclips endgame.
(I'm also appreciating the irony of people like me being angry about Apple announcing they're going to implement a version of what Google has already been technically able to do for the last decade, and what Microsoft has probably been doing in secret since even before then.)
> This distinction of changing the defacto ownership of your device and data is the real inflection point.
So the ability to store child porn is what constitutes "de facto ownership" in your mind?
But why would they "use this tech to hunt down bitcoin owners"? They could just scan emails or photos directly. Doing it by way of neural hashes and vouchers seems like an absurdly complicated detour when they already own the OS and all the most commonly used apps.
De facto ownership means your personal property doesn't get searched for evidence of a crime without reasonable cause.
You will really own nothing and you will be “happy”. Soma here. Soma there. Soma, soma everywhere.
> The good news is running OS X made me lazy, and getting back into running a Linux or FreeBSD laptop again is going to be fun.
This is good news.
>So what happens when, in a few years at the latest, a politician points that out, and—in order to protect the children—bills are passed in the legislature to prohibit this "Disable" bypass, effectively compelling Apple to scan photos that aren’t backed up to iCloud? What happens when a party in India demands they start scanning for memes associated with a separatist movement? What happens when the UK demands they scan for a library of terrorist imagery? How long do we have left before the iPhone in your pocket begins quietly filing reports about encountering “extremist” political material, or about your presence at a "civil disturbance"? Or simply about your iPhone's possession of a video clip that contains, or maybe-or-maybe-not contains, a blurry image of a passer-by who resembles, according to an algorithm, "a person of interest"?
What I don't get is: what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services, so the point at which the scanning is done is mostly arbitrary from a process standpoint (I understand people believe there are huge differences philosophically). They could already have scanned our files, because they already have full control over the entire ecosystem. If they can be corrupted by authoritarian governments, then shouldn't we assume they have already been corrupted? And if so, why did we trust them with full control of the ecosystem?
In years previous, take the San Bernardino shooter case for instance, Apple argued in court that creating backdoors or reversible encryption was insecure, subject to exploits by malicious actors, and thus "unreasonably burdensome". They also argued that compelling them to write backdoors violated the First Amendment.
It was most likely a winning strategy that the FBI actively avoided getting rulings on and found a workaround.
What Apple is creating here is an avenue for the FBI/NSA/alphabet agencies to use a FISA warrant and an NSL to mandate hits on anything. The argument that it has to be pre-iCloud-upload, subject to manual review, or gated on some arbitrary threshold is just the marketing to get the public to accept it.
All of that can easily be ordered to be bypassed. So it can become: scan, single hit for X, report.
I'll take the downvotes, but if anything, someone more conspiracy-minded could easily take this as a warrant canary. Given the backlash Apple has faced and ignored, it doesn't make much business sense for them not to back off unless they are:
A) betting on it being only a vocal minority that resorts to action (which is entirely possible, especially given the alternatives and the technical hurdles to getting to a suitable alternative), or
B) being pressured by governments right now (also entirely possible given their history with the FBI and previous investigations).
[1] https://www.rpc.senate.gov/policy-papers/apple-and-the-san-b...
[2] https://en.wikipedia.org/wiki/FBI%E2%80%93Apple_encryption_d...
> What Apple is creating here is an avenue for the FBI/NSA/alphabet agencies to use a FISA warrant and an NSL to mandate hits on anything. The argument that it has to be pre-iCloud-upload, subject to manual review, or gated on some arbitrary threshold is just the marketing to get the public to accept it.
Why would they make things even more complicated, with limited access, when they can already access everything in the cloud? Let's leave out the argument about expanding the scan to the whole device; if that happens, people will really start discarding their phones.
12 replies →
It isn't a philosophical debate. It's about invading and controlling someone else's property. I can't shack up in your home and eat your food just because I feel like it. We're all doomed because digital natives have no concept of the boundary between something you own and something someone is renting to you, or letting you use for free in exchange for data mining.
Like I said, Apple controls the hardware, software, and services. They already control your property.
3 replies →
The concept of ownership you are asserting is but one of many historical principles of ownership. There are, however, other concepts of ownership that conflict with the one you are asserting.
https://www.econtalk.org/michael-heller-and-james-salzman-on...
I don't think there is a good-faith argument that Apple is invading or controlling anything you own. All that's happening is that you agree to run the algorithm in exchange for using iCloud Photos. That's just a contract; a mutual, voluntary exchange.
7 replies →
Apple is renting the phone to you for $1000 down and $0 a month (unless you actually are financing). Therefore, they are the landlord and, given notice, can change the property as they feel fit.
13 replies →
I agree we are all doomed, but I don't agree it has much to do with being a digital native or not. My boomer grandparents, my gen X parents, and my millennial self are all affected by this. And gen Z (the first generation of digital natives), and whatever comes after gen Z, is not to blame for that. Reducing it to a generational thing is silly.
1 reply →
I think “we don’t have the machinery to do that” is an effective argument in the real world when someone asks you to do something. I’m not sure if it matters legally (lawyers sometimes use vague phrases like “reasonable effort”), but it definitely affects how strongly people will pressure you to do things, and how likely you are to acquiesce to that pressure.
The scope of the change Apple would need to make to scan your photos arbitrarily just got a lot smaller. The number of engineers who would need to be “in the know” to implement this change got smaller. The belief from governments that Apple has the option of doing this got stronger. The belief among Apple’s own management team that they can do this got stronger.
This is very well put.
Because that door hasn’t been opened yet. “Scan every photo on users devices” or “scan for non-CSAM” are much easier requests once they’ve already started scanning on-device.
It’s just how life and politics work.
The door has been opened for quite some time. What do you think Spotlight is? It scans and indexes all your data.
What's prevented the government from saying "hey, if you see Osama bin Laden in a Spotlight scan, you need to send us all that guy's data"?
The answer is, Apple can just say FU. And that's exactly what will happen here. In particular, the US DOJ needs to stay in Apple's good graces here and not be overly aggressive. If DOJ pulls any funny business, that's a pretty good reason for Apple to just say "OK, we're picking up our toys and going home. You get nothing now and we're turning on E2EE."
9 replies →
This entire argument is a non sequitur and comes up like clockwork every time this issue is discussed. It's the metaphorical equivalent of saying "well someone could've snuck in through the open window. Let's just assume they did and leave the doors open as well".
How about instead we push back against Apple further shifting the Overton window on how acceptable it is for companies to run intrusive services on hardware we own?
It’s not a non sequitur. The comment is engaging with a series of rhetorical questions that imagine a slippery slope by observing that very little has changed about the trust model between iPhone users and their devices. If you are convinced Apple is slipping, then it is worthwhile to be able to answer how their position today is different than it was last month. That is of course a different question than whether their position last month was acceptable, and maybe people are realizing it was not.
As a concrete example, if you think the proposal introduces new technical risks, then if Apple announces they made a mistake and will instead scan entirely on the server, you may be satisfied. However, I’d argue that since no new technical risk has been introduced, your conclusions should not change.
I’d argue that the incorrect characterization of Apple’s announcement as scanning all the files on your phone with no control has shifted the Overton window more than what was actually proposed. Politicians who are none the wiser probably believe that’s what Apple actually built, even though it’s not.
2 replies →
What I don't get is what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services...
Simple: Money.
Their response to any such demands would be (and has been) "we don't have the capability to do what you're asking".
No judge is going to burden them to spend their own dime to build a massive new feature like this and deploy it to every phone out there to comply with a demand arising from a prosecutor of an individual case.
No government is going to pony up the money to reimburse them to do it (not even getting into the PR optics).
That leaves it happening only if 1) they decide to do it themselves, or 2) government(s) legislate they must.
So far #2 hasn't happened. Politicians had no frame of reference to point to and say "Your competitors are doing that, you should too".
But now that #1 occurred, it will normalize this nonsense and pave the way for #2.
> Their response to any such demands would be (and has been) "we don't have the capability to do what you're asking".
> No judge is going to burden them to spend their own dime to build a massive new feature like this and deploy it to every phone out there to comply with a demand arising from a prosecutor of an individual case.
The government does not care one bit about how much it costs or whether it is even possible. They demand the data with an ultimatum: deliver it as requested by our deadline, or we send in our IT people to take it. Sorry (not sorry) if it takes your whole company down while we plug our own servers into your datacenter to take your data.
1 reply →
Their response to such demands has not been that they are technically incapable of doing what's requested. The demand from the FBI in the San Bernardino case was a very small change to passcode-retry constants, because the terrorist's device did not have a Secure Enclave.
3 replies →
The politics of it is very different, and that's where the danger lies:
https://news.ycombinator.com/item?id=28239506
I think that quite a few engineers are too focused on the technical aspects of it, and specifically on all those "barriers to misuse" that Apple claims to have in place. But it'll be much easier to remove the barriers once the system as a whole is in place.
The reason we're focused on the state of it now is that we can switch at any time - especially if those barriers are shown to be ineffective or are removed at some point.
3 replies →
There is a fairly large difference, the first being that it would be massive damage to Apple's brand if they started scanning people's phones without permission.
But now that they've built the system to scan things on-device, they can be compelled by a government to scan for other things, and Apple can shrug and say they had no choice.
Buried in the EULA, you give consent.
Why would Apple start shrugging now when they've been fighting the FBI in court?
7 replies →
> They could have already scanned our files because they already have full control over the entire ecosystem
They have already been doing it for email since 2019: https://www.indiatoday.in/technology/news/story/apple-has-be...
> They could have already scanned our files because they already have full control over the entire ecosystem.
Apple barely submits any CSAM[0]:
> According to NCMEC, I submitted 608 reports to NCMEC in 2019, and 523 reports in 2020. In those same years, Apple submitted 205 and 265 reports (respectively). It isn't that Apple doesn't receive more pictures than my service, or that they don't have more CP than I receive. Rather, it's that they don't seem to notice and therefore, don't report.
0: https://www.hackerfactor.com/blog/index.php?/archives/929-On...
At some point this will be proven, and we'll go back to regular digital cameras or even Polaroids.
The existing law and the user agreement also set your rights.
Apple might have done what you say last month, illegally.
Now it's in the user agreement, and they can do it legally, at scale.
This literally creates a precedent.
Nothing except Apple saying you could trust them. People were stupid enough to accept that and now even the trust is gone.
That's rational, but the point he's making is that this system obliterates the only defense we have had or could have against such activity: end-to-end encryption. This approach owns the endpoint.
…in the same way any existing feature of iOS that makes device data available to Apple (eg iCloud Backup) “owns” the endpoint, no? What’s to stop a malicious Apple from turning on iCloud Backup for all its users and hoovering up your Signal messages database and iCloud Keychain?
1 reply →
> What I don't get is what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services...
Yes, proprietary black-box hardware and software is poor from a user-privacy perspective. But if Apple began on-device scanning of content, I'd imagine eventually someone would notice the suspicious activity and investigate.
With Apple's announcement, the scanning will just be something that Apple devices do. Nothing to worry about. And, no way for anyone to independently verify that the scope of the content being scanned has not been secretly increased.
As for iCloud: if your content is not encrypted on the device in a manner where only you have the keys, any cloud storage is suspect for scanning / data mining. But on-device scanning is a back door for E2E encryption; even on-device encryption with keys only you control is thwarted.
> no way for anyone to independently verify that the scope of the content being scanned has not been secretly increased.
This seems like the easiest thing out of the lot to verify.
The way that this system is designed to work is that when uploading to iCloud Photos, images have a safety voucher attached to them.
If Apple secretly expanded this to scan more than just iCloud Photos, they would have to either a) upload all the extra photos, b) add a new mechanism to upload just the vouchers, or c) upload “fake” photos to iCloud Photos with the extra vouchers attached.
None of these seem particularly easy to disguise.
Your concern is completely understandable if you are starting from the premise that Apple are scanning photos then uploading matches. I think that’s how a lot of people are assuming this works, but that’s not correct. Apple designed the system in a very different way that is integrated into the iCloud upload process, and that design makes it difficult to expand the scope beyond iCloud Photos surreptitiously.
Could Apple build a system to secretly exfiltrate information from your phone? Of course. They could have done so since the first iPhone was released in 2007. But this design that they are actually using is an awful design if that’s what they wanted to do. All of their efforts on this seem to be pointed in the exact opposite direction.
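The design constraint described above can be sketched in a few lines of Python. This is a toy model under loose assumptions: the real system uses NeuralHash, a blinded hash database (private set intersection), and threshold secret sharing, none of which are reproduced here; the hash values, function names, and the roughly-30-match threshold are stand-ins, and the "crypto" is simulated with plain booleans.

```python
# Toy model of the safety-voucher flow: a voucher is generated only as
# part of an iCloud Photos upload, and the server learns nothing about
# matches until enough vouchers match.

THRESHOLD = 30                       # approximate publicly stated figure
KNOWN_HASHES = {0xBAD1, 0xBAD2}      # stand-in for the CSAM hash database

def make_voucher(image_hash):
    """Client side: every uploaded photo gets a voucher, match or not,
    so upload traffic alone doesn't reveal which photos matched."""
    return {"matched": image_hash in KNOWN_HASHES}

def server_can_review(vouchers):
    """Server side: match results stay sealed (here, simply uncounted)
    until the number of matching vouchers crosses the threshold."""
    return sum(v["matched"] for v in vouchers) >= THRESHOLD

uploads = [make_voucher(h) for h in range(100)]       # benign photos
uploads += [make_voucher(0xBAD1) for _ in range(29)]  # below threshold
assert not server_can_review(uploads)
uploads.append(make_voucher(0xBAD1))                  # 30th match
assert server_can_review(uploads)
```

The structural point falls out of the model: vouchers exist only as attachments to iCloud Photos uploads, so widening the scan to other files would require one of the new, observable upload paths listed above.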
How do you think Apple will increase the scope of what's scanned without every person with Ghidra skills noticing?
1 reply →
This is a very well-written post. Ever since this program has been announced I have struggled with talking about the implications succinctly.
Online, I never know if an interlocutor is even arguing in good faith, but even in person it's difficult to balance talking about all the ways that the claimed safeguards are meaningless, how the benefits don't really make sense, how this is markedly different from other infringements on privacy with the need to be concise and explain that the real problems aren't just theoretical because similar invasions of privacy are killing actual people around the world already.
Anyway, I think the only practical way this could resolve well is if Apple saw a precipitous decline in its iCloud brand; then it could be argued that they had to abandon the plan for purely business reasons. A serious movement to abandon Apple services ($17.5B revenue in 2021 Q3) might empower the people within Apple who opposed this reckless plan from the beginning.
I still think it's more about DMCA 2.0 than any "save the children" bullshit.
Today it’s CP, next year it will be copyright violations…
2 replies →
Australia has already shown what the end-game is, with its "The Assistance and Access Act 2018" [1]. It's not illegal to have end-to-end encryption, but it's illegal to deny access to the ends of the encrypted pipe.
As an aside, Australia has just implemented the next step: the "Surveillance Legislation Amendment (Identify and Disrupt) Bill 2021" [2], which makes it legal to hack your device to access the ends of the pipe. Useful if the ends of the pipe are not controlled by a malleable corporation.
[1] https://www.homeaffairs.gov.au/about-us/our-portfolios/natio...
[2] https://www.aph.gov.au/Parliamentary_Business/Bills_Legislat...
This actually sounds like the right way to go. Individual, warrant based access is comparable to wiretapping in a way that Apple's dragnet approach is not.
Don't forget the #datbill as well. [1]
Australia is a bad joke at this point. I thought nanny state was bad but we're firmly moving into Stasi territory now.
Even the government's obvious incompetence is looking like not enough protection in the face of all these overreaches.
[1] https://mobile.twitter.com/efa_oz/status/1430674903548661767
This is so disgusting.
Ah, this old bogeyman.
The media drastically overreacted to that act, to the point where the Department of Home Affairs now has an entire page dedicated to addressing the false reporting [0].
The TL;DR is that the act doesn't allow the government to introduce mass surveillance. Section 317ZG [1] expressly forbids any law enforcement request from _having the effect_ of introducing any systemic vulnerability or weakness and _explicitly_ calls out new decryption capabilities as under that umbrella. Your claim that a company can't deny access to the ends of an e2e-encrypted pipe is false.
And yes, that new act exists. The government will be able to hack into your devices and take over your accounts _with a warrant_, just like they can break into your house or take money from your bank account _with a warrant_.
[0]: https://www.homeaffairs.gov.au/about-us/our-portfolios/natio...
[1]: http://classic.austlii.edu.au/au/legis/cth/consol_act/ta1997...
This is a reduction in privacy. Do what you want on your servers, but hands off my phone.
Also the whole "Apple is planning to encrypt iCloud Photos end-to-end anyway" thing is just fanfiction. I'll believe it when they announce it.
> This is a reduction in privacy. Do what you want on your servers, but hands off my phone.
Apple devices might not be precisely the smartest purchase if the concept of owning your hardware is important to you.
> Apple devices might not be precisely the smartest purchase if the concept of owning your hardware is important to you.
Maybe a handful of HN users are aware of that, but the majority of users think that their property belongs to them.
It also goes against what Apple marketing says about privacy and your data. I wouldn't fault most consumers for not understanding that Apple's PR doesn't reflect reality.
2 replies →
As you are well aware, the market is not very competitive and there aren't dozens of vendors to pick from.
"Use something else" (or even more laughably, "start your own") is not a reasonable argument anymore.
9 replies →
I have a modest suggestion in the spirit of Apple's move.
As we know, there are people in the world who are running meth labs or creating explosives for terrorists in their homes. In order to safeguard the public, we shall have a detachment of dogs which will sniff everyone's houses every once in a while. When they sense something bad they'll alert their handlers and there'll be a manual inspection before reporting to police.
There's no risk to privacy here: dogs being dogs, they can't tell their handlers what they sense. We can also show the training publicly so people can verify the iDogs are trained to sense only drugs or explosives. So it's all even more secure than Apple's iPhone scanning! What say you?
I understand you're making a reductio ad absurdum argument here, but this is actually very similar to what LEOs often try to do today (e.g. searches based on what is smelled or seen inside your car at a traffic stop), and the iDog might actually be constitutional.
The constitutional standard for a warrant search is "probable cause", and for a warrantless search you generally also need exigent circumstances. Assuming that a judge is sufficiently satisfied with the iDog's nose, and the iDog was sniffing somewhere public like the sidewalk when it found the meth smell, you could likely establish both probable cause (iDog smells meth) and exigent circumstances (meth labs often blow up, meaning there's emergent danger that cannot risk waiting for a warrant).
That's not to excuse Apple, just to provide a fun backstory on the things law enforcement gets to do in this country.
Another one that was nearly deemed constitutional: in Kyllo v. United States, LEOs used thermal imaging to find that an Oregon man's house was radiating a high amount of heat, indicative of intense grow lights, which they used as probable cause to search the home for an illegal pot-growing operation. This was found unconstitutional only by a 5-4 decision in the Supreme Court. Had it gone the other way, you can imagine we'd have helicopters flying overhead thermal-imaging for pot operations today.
Upvoted.
That said, I do feel you miss the genius of the iDog proposal. As far as I understand, an officer might sometimes be able to use his dog's nose if it happens during a procedure (which might include the dog searching if there's a warrant), but he can't create the circumstances deliberately. "I was doing something proper and then the dog started jumping" might be admissible, but if an officer started walking the dog around hoping to catch people, opinion might be different.
We suggest regularly scanning every household in the nation in a deliberate process. I was just proofreading five different papers proving the system is perfect if we can trust the dogs (of course we can; only monsters and terrorists don't trust dogs).
1 reply →
Agreed, I have nothing to hide & love dogs.
And hate terrorists!
damn, what if the dog gets excited and barks because I'm dry-aging some of my (fully legally hunted!) wild game?
or because the handler accidentally stepped on the dog's tail?
Don't worry, these are well-trained, well-bred and very well-fed iDogs. 'Not eating when not fed by the handler' is part of the basic training. Also, there's a manual verification step where the handlers search your property before reporting to the police.
The chances of an error are less than one in a billion. It's worth it to beat the drug dealers and terrorists.
That's why we have the manual human reviewers, you see.
The article mentions the slippery slope and the "what happens in a year or two when..." scenarios. The article even calls it a cliff. But it doesn't expand on the timelines of concern.
As it currently stands, this concept would be sitting in plain sight waiting eternally for any lawmaker anywhere.
In your country, either side of the political spectrum - with a majority in lawmaking - can simply tap Apple on the shoulder and potentially turn ALL those devices against you.
Guns won't help when information technology is used against you, when you are separated from society in such a manner that people dare not risk their own livelihood for fear of being similarly marked.
And if Apple goes ahead with this, this risk is sitting there for the rest of your life just waiting for [that one politician that represents everything you hate] to use it against you.
Maybe that politician hasn't been born yet. But they will come. Don't let this Pandora's Box sit waiting for them.
It's the same reason free speech is something akin to sacred even for your worst enemies: those who start taking away the bad people's speech are themselves always one political actor away from having theirs taken away.
A powerful piece...
For the privacy-minded Apple users among us (I mean, that's who they marketed to, yeah?), I'd recommend turning off automatic software updates... For as long as it makes sense to. I hope they reverse their decision, but I'm already looking for alternatives. I'm certainly not buying another Apple device, even though I'm about due.
They really lost a lot of fans with this, myself included.
Yeah turned it off and iCloud too!
I was all in on Apple. Now got a System76 laptop on the way. Transitioning off iPhone to Linux will be tough but something new to explore.
In the essay the "i" in the headline is lower case, which is significant and chilling. It's a homunculus of Apple's new direction: the meaning changed from "me" to "panopticon".
Yes. Our title caser doesn't understand such nuances, but we've corrected it now.
Why stop at photos you take?
Since we're doing this on-device, we can just turn the camera on every few minutes and ask an AI if it sees something interesting.
If it sees that you're in trouble it can start streaming to the authorities.
We will finally be safe.
By the way, if your phone is off or left at home we will know you're in trouble and send assistance right away.
I wish I could say /s.
I don't know... I have a really hard time getting too upset about this. I'm a big proponent of privacy and have always been a Snowden supporter. And while "protecting children" is a trope in politics, I think everyone with an iPhone knows they're giving up some privacy to own one. It's constantly tracking their location and sending other data to Apple.
This isn't a government agency. Apple has been incredibly thoughtful about privacy in the past, and I feel like they've earned the benefit of the doubt here.
I hope I'm not wrong, but I don't see how this is insane. They're just making sure the files you upload to them aren't illegal.
Maybe I'm completely paranoid here, but given that actual sex offenders commonly seek out ways to be near children, what happens if one or more of them ends up on Apple's image-vetting team?
They'd be completely anonymous and fully covered, with an endless pipeline of images of naked kids being delivered to them.
The idea that if you take a picture of your kid in the bath, it just happens to match a CSAM fingerprint and then gets silently transmitted to anonymous reviewers for "review" is terrifying.
This is a disgusting thought, but hear me out. Perhaps this might actually be a good job to give to a paedophile. Their classifications would probably have a lower false-positive rate than those of someone who is disgusted by the images, and it would all but eliminate any concern about an employee suffering psychological trauma.
4 replies →
Your terrifying idea mischaracterizes the nature of false positives. Any photo in your library is as liable to be a false positive as any other; the perceptual hash is not looking for images that are similar by the metric a human uses (content). That's also the underlying idea behind why people have been able to turn arbitrary images into adversarial false positives.
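To make the point concrete, here is a toy "average hash" over tiny 8-value images. This is purely illustrative: the real system uses NeuralHash, a neural network, and all the values and function names below are assumptions invented for the sketch. It shows how two visually different "scenes" with the same bright/dark layout collide, while perceived content plays no role.

```python
# Toy perceptual hash: 1 bit per pixel, "brighter than the mean?".
# Matching is by Hamming distance over these bits, not by content.

def ahash(pixels):
    """Threshold each pixel against the image's own mean brightness."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

photo_a = [200, 190, 40, 35, 210, 30, 205, 45]  # one picture
photo_b = [120, 110, 10,  5, 130,  2, 115, 20]  # different scene,
                                                # same bright/dark layout
unrelated = [10, 10, 10, 10, 200, 200, 200, 200]

# Collision: the hash cannot distinguish these two "images".
assert ahash(photo_a) == ahash(photo_b)
# A genuinely different layout does differ in hash bits.
assert hamming(ahash(photo_a), ahash(unrelated)) > 0
```

An adversary crafting a false positive only needs to steer those bits toward a target hash; the result need not resemble anything a human would flag as similar.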
11 replies →
They can do that on their servers, not on my phone.
"just making sure the files you upload to them aren't illegal"
The problem lies in this sentence. 1) This happens on-device, before the files are uploaded, which is a monumental shift for a company that claims to be pro-privacy. 2) They're now saying they're willing to surveil photos on behalf of governments; the reason is sorta irrelevant. They're opening Pandora's box: are they going to start scanning files on behalf of the RIAA or other copyright holders now?
I'm curious could you unpack what this means for me? "big proponent of privacy and have always been a Snowden supporter"
Given that, my assumption is this would click for you, but as you said it doesn't. What does being a big proponent mean to you? How do you support Snowden? What's important to you about privacy? Curious to hear your logic, I bet there are tons of people who have the same concerns (or lack of).
"just"
And who gets to decide what is illegal?
Laws?
5 replies →
Thank you for expressing this opinion. I know it’s not a popular one; but I’m 100% with you.
Time has proven otherwise. All censorship systems start with protecting the kids, and grow to eventually encompass all undesirable content…
How have Google’s, Facebook’s, or Microsoft’s CSAM scanning grown?
4 replies →
Found the person who didn't read the article.
The site guidelines ask you not to post like this. Could you please review and follow them? https://news.ycombinator.com/newsguidelines.html
Also, it would be good if you'd stop posting unsubstantive comments generally.
I read it. I just disagree with it.
I stopped work on a memo app. Was piggybacking on Apple's branding around privacy. Am going to wait a year to see how this shakes out. Super disappointed.
[Edit]
Here is/was the privacy statement https://www.deepmuse.com/privacy - I'm kinda embarrassed.
IMHO this is really another continuation of the "you will own nothing and be happy" trend that has been around for a while, but that companies have started to push hard in the last few years. Slowly eroding ownership and normalising mass surveillance is their goal, so they can continue to extract more $$$ from you.
Be seeing you.[1]
[1] https://archive.org/details/ThePrisoner01Arrival
When Apple sells yet another record number of iPhones next quarter, which device should we move to?
Fundamentally this illustrates that software has become too inherently intrusive. What's the solution, though, that could ever be mainstream?
The other issue is that software has become too complicated and too many (potentially) bad things are happening in the background. How can the layperson fight back?
GrapheneOS on a Pixel 4 has been a dream, and elementary OS on an XPS 13 is similarly great.
The WebUSB installer for GrapheneOS is a game-changer; it made the process incredibly easy. Also, most apps seem to work fine without Play services.
Synology Photos is a great local iCloud Photos replacement.
I made the switch this past weekend. Aside from the impact on my wallet and hassle of needing to sell my Apple equipment, it was surprisingly painless.
I think this kind of defeatism really feeds the public's lax attitude toward privacy.
Yes, the iPhone and Apple products are very popular. And they will probably continue to grow. Does that mean we just accept anything they do, antithetical to one of their core promises to their customers?
Or do we make a big deal about it so everyone sees what's happening and what the implications are?
I've been making a big stink about it. I switched to Signal for my iMessage buddies and that seems to be sticking.
The more people seen visibly taking proactive steps to create privacy, the better.
Do you mean that both the state and corporations will make you a generous gift of privacy? Right when it goes against their best interest of grabbing more power and profit?
Nope.
If you want privacy, like any other right, you will need to fight for it. Rights can only be gained by fighting; whatever is given is a privilege, which is often taken back as easily as it is granted.
So be prepared to (continue to) fight for your rights: in courts, in Congress, etc, but also by choosing less convenient, less featureful, more expensive devices and software which does not violate the rights you care about. And no, the majority of the consumers won't care until you show some signs of winning.
That's exactly correct. We have to accept discomfort NOW (to our convenience, to our wallets) so that everyone can benefit later.
We cannot give up.
I was very, very close to ditching my Android device for an Apple device because it seemed like Apple was on the side of privacy.
I don't feel that way anymore, and I'm watching FOSS projects like the PinePhone with a lot of interest.
I made the switch from Android to Apple a few months ago for exactly this reason, and boy do I feel like I’ve been bait-and-switched.
2 replies →
I actually had an iPad picked out and in the basket, ready to buy! I've been thinking about buying one for a good few months now, but I've been hesitating a lot because I wasn't sure I could handle how restrictive iOS is compared to Android.
The thing that really has me torn is that there is nothing else like it for simple/fun/creative music-making apps (Samplr, Reason Compact, etc.), which is the main reason I was going to buy one. At the same time, I don't want to be part of Apple's figures next quarter.
The solution is privacy-safe open standards that decouple phone-network tech from corporations' proprietary hold.
Apple is the only device I know of doing on-device scanning.
On-device scanning vs. in-the-cloud scanning is a distinction without a difference.
5 replies →
> How can the layperson fight back?
Simply not using a smartphone is fine, they aren't that great and there doesn't need to be an alternative.
It feels like we've been trained as consumers to the point where saying no isn't realistic anymore; there must always be something else to buy that represents me better, etc.
Software must be open source by law. Apple has shown closed software is too dangerous.
Stop keeping your computing needs locked behind service agreements.
I think the iPhone is superior and would hate to leave it, although I barely do much with my phone beyond 2FA, browsing, and texting, so a switch won't be too bad in that regard.
I do love my MacBooks though. So while these privacy invasions make me angry, I'm not willing to drop Apple altogether. I've been meaning to keep most of my sensitive information on a usually-disconnected Linux machine anyway. I'll keep using my MacBook for development.
I see this Apple move as a warning. I have lived part of my life under a communist regime.
For me my Apple addiction ends here. There is no "magic" left in their products, only "bait & switch" dark patterns.
No hardware or UX will lure me into suppressing my instincts again.
This is the beginning of a global, politically and financially motivated race for public control. Apple is just giving a spark to the fire. Imagine a future in which your beloved Face ID is tied to everything, and your beloved iDevices, Teslas, or home appliances are scanning and reporting, scanning and reporting. There is no middle ground in this for me. No benefits or conveniences are that important. FOSS and public oversight of software must be demanded by law.
I posted this earlier without getting any reactions: https://docplayer.net/1287799-Fourth-amendment-search-and-th...
I don’t understand why this outrage seems so US-centric. This is the same Apple that hands over all your iCloud data (photos and otherwise) to the CCP if you happen to live in China. And they’ve done this openly for the last several years.
What am I missing? Isn’t that a much much much much worse thing for Apple to do? Why are we only suddenly suspicious of Apple’s privacy claims with this matter?
As an American I couldn’t care less that Chinese people’s data is handed over to Chinese people’s government, especially considering the alternative would be that Chinese people’s data is handed over to a US entity and by association the US government.
Contrary to popular belief, iCloud data, while encrypted, can be decrypted by Apple and is subject to US law enforcement requests. *
Considering this fact, it is pretty one-sided to see this as some sort of inconceivable act. However, if you look at it from the other side: would we want all American user data (assuming Russia had a company with as pervasive a penetration into American lives as Apple has globally) to be sitting on Russian servers, subject to arbitrary Russian laws?
So if you only consider American interests, it's inconceivable for us to give up such power and control over other sovereign nations, but perhaps other countries don't care about American interests the way we do.
All Apple did in China was comply with local laws to stay in business there. What Apple is doing in the US is not mandated by law (as far as I know).
From the American side, Apple has marketed itself as privacy-focused, even fighting the FBI publicly at the risk of negative publicity. This about-face is unexpected, but it also betrays those of us who invested in the Apple product line under the expectation that they would continue the standard of privacy and security they marketed. Chinese users probably never expected this level of privacy to begin with, but we did, and we can.
* iCloud messages backups can be decrypted
This to me is a surprising attitude. As an American whose outlook is generally framed by American values, it’s very upsetting to think of how privacy and freedoms are systematically impinged upon in so many parts of the world. If we can be upset about invasion of privacy in one country, why would those principles change at geopolitical borders?
16 replies →
> Chinese people probably never expected this level of privacy to begin with, but we did and we can.
Taiwanese users' data is also backed up to China. This was confirmed to me by an Apple Support representative in China. Whether you believe Taiwan is part of China or not, I can assure you users in Taiwan do expect this level of privacy.
What we are seeing is a slow deterioration of user privacy across the Apple ecosystem, not just in the US. So even if you don't care about Chinese users' data, it does show what Apple management as a whole thinks of your data.
Tick-tock
> Contrary to popular belief, iCloud data, while encrypted, can be decrypted by Apple and is subject to US law enforcement requests.
Most seem to forget that with this upcoming feature, that is no longer possible. Apple can't decrypt your images on request anymore. (Read up on Apple's PSI system.)
There is also strong evidence that the same is coming for backups: the iOS 15 beta includes a backup recovery option via an authentication key.
9 replies →
Apple didn’t wake up one day and decide to do this on a lark.
They’re being proactive, probably in a minimalist form, to anticipate regulatory powers on what is unarguably the largest or second largest platform used for illegal porn.
If FB screeners have PTSD and are killing themselves over what they have to see every day, imagine what is on iCloud and iPhones. Right now, nobody is required to filter that content while social media is. The alternative to "sure, you tell us what is illegal and we'll scan for it" is "We're the govt and we want to see everyone's photos, for the children."
Sure, the latter may still happen, but probably later than sooner now. I'm surprised it has taken this long.
1 reply →
It's my understanding that Apple simply can't operate in China without playing by those rules. So really, the onus is on the CCP.
In this case, is Apple being compelled to do this by the US government? Or is it a choice Apple has made purely internally? I think that makes a difference.
I agree the question of whether Apple was compelled by whatever government (or if they did this voluntarily) has implications on the ethics of these decisions. They may genuinely have no choice.
But I don’t see how it affects the question of whether Apple’s privacy assertions are trustworthy.
4 replies →
Maybe similar pressure was placed on them here and we just aren't privy to it.
19 replies →
> is Apple being compelled to do this by the US government? Or is it a choice Apple has made purely internally? I think that makes a difference.
You're being downvoted but it's a critical issue.
If Apple is currently being compelled to do this, it likely means the US Government has a massive new privacy obliterating program underway and Apple probably isn't the only tech giant joining the human rights violation parade. It's important to find out if that's going on. We can be certain they didn't stop with PRISM.
If it turns out to be the case, that Apple has joined up to another vast human rights violating program (they already did it at least once before, remember), the US needs to move forward toward Nuremberg-style trials for all involved Apple management and all involved Apple employees (and not only them). That's the only way it stops.
Such human rights violations should not be allowed to continue. How many tech employees at these companies got away with extraordinary human rights violations related to PRISM? Employees at these companies were responsible in part and critical to helping to make it happen. Who are these enablers? Why aren't they in prison? Why is this so rarely discussed on HN? (yeah we all know why)
HN is pretty amusing about this topic. Privacy is a human right? Yeah? Also universally HN: but let's not talk about the people actually responsible for the human rights violations; let's not talk about all the techies being paid princely sums to commit human rights atrocities. Let's not talk about prison sentences for what they've done to their fellow humans. Let's not hold tech employees responsible.
1 reply →
I imagine because most of us care more about US policies in general than Chinese because most of us live in the US. If fixing Chinese lack of free communication were on the table I'm sure we'd mostly be for it, but that's a whole other thing that ultimately goes back to their government.
I mean, that’s also bad? But the CCP is not going to budge on this, and it doesn’t affect me as much as an American, so I feel like I can be upset about both and more upset about the one that affects me directly.
Because most Chinese people keep silent about their government?
People care more about what happens to them than others and people care more about what happens where they are than elsewhere.
As far as is documented, the behavior of iCloud does not change, just the operator. In particular, the difference is that end to end encrypted data in iCloud remains that way, so saying all iCloud data is handed over is incorrect.
In fact, iMessage is the only end to end encrypted messaging service operating in the country (for example).
It’s my understanding that the keys used in that “end-to-end” encryption are also under the control of the operator [1], so from a privacy perspective it is the same as handing over that data in plaintext.
[1] https://www.nytimes.com/2021/05/17/technology/apple-china-ce...
4 replies →
> the fact that, in just a few weeks, Apple plans to erase the boundary dividing which devices work for you, and which devices work for them.
A very overdramatic sentence. It is a bit scary to realise that people only now think this boundary is being crossed. It was crossed a very long time ago. In the first years of Android, owners were the product, not the phone. Privacy features in recent years may have improved this a little.
Google's massive success on many services is based on how phones and their software were collecting data for them. User interfaces are just illusions for non-technical people. They might give you a sense of control.
Now that Apple does not trust us about CSAM material, the end is near. There are arguments on both sides, and many are taking sides just to get attention.
However, you can only solve this problem with politics.
This whole switch to Linux is not a solution.
Privacy advocates need to be like Second Amendment activists. We need to use their playbook. They raise a big stink about anything, no matter how big or small, that could curtail their rights. No number of Sandy Hook events will result in meaningful changes in laws.
Pushing everyone to Linux will eventually lead to all hardware falling under some national security law, where hardware may be imported only if it restricts which OSs can be installed and ships with a locked bootloader.
The free market has no impact here; the masses don't care. And privacy supporters are too logical to whip up any kind of movement.
Till privacy advocates come up with emotional reasons why privacy is absolutely necessary (like grandma is gonna die without it), this is a losing battle.
> Till privacy advocates come up with emotional reasons why privacy is absolutely necessary (like grandma is gonna die without it), this is a losing battle.
They have already come up with good reasons.
"Every time you use encryption, you're protecting someone who needs to use it to stay alive." -- Bruce Schneier
"Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say." -- Edward Snowden
Surveillance harms journalism and activism, making the government too powerful and unaccountable. If only activists and journalists try to have privacy, it will be much easier to target them. Everyone should have privacy, to protect those who need it. It's sort of like how freedom of speech is necessary not just for journalists but for everyone, even if you have nothing to say.
> Pushing everyone to Linux will eventually lead to all hardware falling under some national security law
I don't see any connection here. Linux is already used on all servers and nothing happens.
The problem is that the NRA is not big because of the people. It is big because of gun companies. Thats why their budget is 20x larger than EFF's.
Privacy doesn't move units like gun rights move guns. Until this changes, privacy is a lost cause in the digital world.
Dude can write, too.
Good job, ES.
Apple prob going to about-face like OnlyPorn
But it should set off some soul searching in the tech community at least about the consolidation of power
Sigh.
You poor fools.
I guess us poor fools?
I never thought the day would come when Apple news would so strongly bring to mind, as a knee-jerk reaction, one short part of a one-sentence-long Hungarian poem indeed called One Sentence On Tyranny. It is a poem many of us know by heart, to remember what was, even if many back in Hungary forgot. I fail to convey my emotions here in just a few words, but I am incredibly saddened.
Anyway, I checked a few translations; they lose some of the power of the original, but let me try. The first two lines are only for context:
[...] you would like to look, but you can only see
what Tyranny conjured up for you
already forest fire surrounds you
fanned into flame from a matchstick
you threw down without stamping it out
Oh yes, nothing new under the Sun. And it might be too late now.
These things have been on my mind so much because I saw an anti-masker protest in Vancouver peacefully escorted by police. My mind melted. I remember, remember all too well: it was only 35 years ago that Hungarian police broke up a protest with batons -- it's called the Battle of the Elizabeth Bridge to this day. It was a very one-sided battle, mind you. And while I don't remember, my parents do, when they broke one up with tanks...
This is what happens when a businessman takes over a founder-led company. "Soulless" and "liars" are the only words that come to mind.
I wonder if any current or future President might revisit the idea of granting Snowden a pardon. He is still viewed unfavorably by a fair amount of the US population, but it seems like that's changing with time.
Around here, people view him favorably but are wary because his choices make him look like an agent for Russia. I think it would change only after a pardon.
So what happens when, in a few years at the latest, a politician points that out, and—in order to protect the children—bills are passed in the legislature to prohibit this "Disable" bypass, effectively compelling Apple to scan photos that aren’t backed up to iCloud?
Isn't this essentially what might happen with so-called ChatControl in Europe?
https://www.patrick-breyer.de/en/posts/message-screening/?la...
I was surprised to not see an RSS link there, but apparently a feed does exist:
https://edwardsnowden.substack.com/feed
Are there any decent open source alternatives out there we can contribute to?
A quick search revealed that there is Librem 5, a Linux based smart phone. I would love to hear more about it.
https://en.m.wikipedia.org/wiki/Librem_5
There’s also Pine Phone.
https://www.pine64.org/pinephone/
I think 1TB[^] would be enough to hold most people's privacy concerns.
[^]: https://www.macrumors.com/2021/08/17/iphone-13-third-week-se...
My question after reading this article is, what can I do as a current iPhone user? Is there a reasonable alternative phone out there that can run the same apps but that isn't effectively controlled by Apple or Google?
Simple: don't store your photos on Apple's property (iCloud). That's the only time this fingerprinting will affect you. If you store photos locally, then no type of fingerprinting will happen.
While you're at it, don't store them on anyone's servers, because they all fingerprint for the same exact reasons. There's no service out there that doesn't do this at some level.
Sorry, I should have been clearer in my post. I meant to ask: if I no longer want to support Apple (or Google) because of their behavior, are there any reasonable smartphone options available? It seems to me that this is the only way I can protest what Apple is doing.
1 reply →
What would happen if in a new paradigm shift multi-nationals decided to use their enormous lobbying power to push back on the government in favor of their users for once instead of only lobbying to screw them over?
Great to see someone influential framing it this way. At the end of the day, what it's there for doesn't matter and unfortunately way too many of us stumble on this mistake in reasoning.
Are there any self-hosted equivalents to icloud photos? (e.g. automated backup/sync. Does the camera app have a way to save to other places, etc?)
I think your best bet might be NextCloud. You can either host it yourself on a storage VPS or find a provider who provides managed NextCloud (or however they repackage it.) Be aware there are many more providers than listed on the NextCloud website and you'd do well to search about. If you're in Europe, take a look at Hetzner's "Storage Share" as an example of what's possible and prices for high-quality hosting, but like I said there's a million operators out there doing this and you can get it dirt cheap if you want.
I don't know how it works on an iPhone, but on Android, NextCloud detects when some app saves pictures in a different directory and asks me if I want it to track that directory. It was already configured by default to track photos taken with the default camera app.
Synology's DiskStation software does this, but that's a significant investment. If you open the DS File app, it defaults to syncing your photos. I don't run mine publicly, so I've never checked how their VPN/cloud sync works.
If I want to share a photo from my phone I use syncthing hosted on one of my VMs on a server I bought and built, but that I don't have physical access to easily (I'll never see it, probably). At home to share a photo I either use mattermost to get a public link to the stored image or ssh to the same box as syncthing runs on. I also host mattermost, on a different VM on a different server in the same datacenter.
I don't like apps seeing my stuff so I just don't use stuff like imgur or whatever.
OwnCloud
A little melodramatic, but the guy is a talented writer.
This is your chance to break free. Please consider switching to an open-source OS. Our freedom depends on it.
If you hear a loud bang, that will be the RMS smug-o-meter exploding.
Edward Snowden just went full Stallman. Take a good idea and ride it over the cliff of sanity.
Of course privacy can't be absolute; we live in a society. Be realistic. Focus on actually evil things. If you think Apple has an evil plan, sure, but most people who object don't even think that.
And for a tech spy, he seems not to understand tech. "What if some evil regime wants Apple to scan for anti-government propaganda?" Well, then they won't be using the CSAM system, that's for sure; they can just scan the images directly, either on iCloud or on-device. Co-opting the CSAM scanner is probably the most impractical way imaginable to spy on Uighur separatists.
Let me see: who will I trust more about "security" or "privacy"?
Some guy on Hacker News, a site full of well-paid Apple employees, or a renowned spy?
Hard choice, right?
I'm not saying you should "trust" me, or anyone. Just consider the facts:
A)
- Apple have complete control over the hardware, the OS and all the most popular apps, including Photos and Mail.
- They also have complete control over iCloud, which is not end-to-end encrypted
- They can and do scan your photos and emails so that they can classify photos, find possible appointments, emails etc, and now they even OCR your photos.
B)
- They are now building a very high profile, limited and locked in system that relies on hashes, external databases, a large number of matches, human review etc.
Do you really think they would use B if they, the FBI, the Chinese government, or whoever, wanted to spy on users? For all we know they are already spying; it would be completely trivial to do so. Clearly, system B is a complete red herring when it comes to spying. They don't need it.
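To make the contrast concrete, the pipeline described in B (hashes compared against an external database, a minimum number of matches, only then human review) can be sketched roughly like this. This is a toy illustration only: the names are invented, and SHA-256 stands in for Apple's perceptual NeuralHash, which it does not resemble in behavior (a perceptual hash matches near-duplicate images; SHA-256 does not).

```python
import hashlib

MATCH_THRESHOLD = 30  # Apple's published design uses a threshold of about 30 matches


def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A real system hashes image *features*
    # so that resized or recompressed copies still collide; SHA-256 is used
    # here only to keep the sketch self-contained.
    return hashlib.sha256(image_bytes).hexdigest()


def flag_for_review(photos, known_hashes, threshold=MATCH_THRESHOLD):
    """Return matching photos only once the match count crosses the threshold."""
    matches = [p for p in photos if fingerprint(p) in known_hashes]
    # Below the threshold the operator learns nothing; at or above it,
    # the matches go to human review before any report is made.
    return matches if len(matches) >= threshold else []
```

The point of the parent comment stands out in this sketch: the whole multi-stage apparatus only constrains what the operator *chooses* to learn, and an operator with full control of the OS could bypass `flag_for_review` entirely.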
2 replies →
IBM gleefully cooperated with the Nazis. It won’t be long until Apple is using this framework to alert the PRC about Chinese dissidents. Just to merely stay in their market.
iOS 15 effectively is the point where Apple kicks off the holocaust that they’re going to be responsible for. I hope they enjoy their place in history because they’re earning it. I’ve loved my iPhones, but there’s a warm place in hell waiting for all Apple employees involved in this endeavor.
Apple is an enemy more threatening to mankind’s freedoms than Al Qaeda ever could’ve been.
i(lluminati) phone.
How did we get to a place (speaking of America, where there is a Bill of Rights) where it is normal to have a multimodal tracker on or near your person at all times?
George Orwell's telescreen at least could not fit in your pocket, and Orwell never imagined things like GPS, Facebook, or digital phones.
We are the slowly boiled frog as the most expansive totalitarian infrastructure in history is built up.
> How did we get to a place (speaking of America, where there is a Bill of Rights) where it is normal to have a multimodal tracker on or near your person at all times?
> George Orwell's telescreen at least could not fit in your pocket, and Orwell never imagined things like GPS, Facebook, or digital phones.
Because we're not living in 1984's dystopia where the government oppresses us, we're living in Brave New World's dystopia where we choose to oppress ourselves.
Phones are not mandatory; if you care about privacy, throw your phone in the bin (better, someone else's bin).
Snowden should get an editor or stick to tweets or video interviews.
[dead]
are we close enough for the government to scan anything passing its road?
Apple has, through side channels, leaked that iCloud is the largest open host of CSAM among big tech. It's the only large provider hosting images that doesn't automatically scan them. The only difference is that Apple wants to do the scanning while leaving your photos in the cloud encrypted. This isn't rational; it's an anti-Apple culture-war position.
This entire argument is based on the premise that Apple only scans photos that the user has requested to be uploaded to iCloud, and will continue to do so.
I don't think many people believe that anymore. Not necessarily doubting Apple's goodwill, but doubting that, once the system is in place and normalized, governments won't mandate it and extend its scope by legislative fiat.
Are the photos in iCloud actually encrypted, though? As far as I’m aware, a government agency can subpoena them already. I’m not sure I agree with the slippery-slope argument, but I’m still failing to see how Apple’s current security model prevents them from performing the hashing on iCloud servers and avoiding all this drama.
And there's no reason they can't scan the images on their servers.
Can you point to any examples of these leaks?
(Edit: Thanks for the links.)
It’s not a leak. It’s an observation from evidence that came out during discovery in the Apple vs. Epic trial that’s being reported now that it’s useful context: https://www.forbes.com/sites/johnkoetsier/2021/08/19/apple-e...
https://www.theverge.com/22611236/epic-v-apple-emails-projec...
#71, from the Epic v. Apple anti-trust trial discovery.
4 replies →
https://en.wikipedia.org/wiki/ICloud_leaks_of_celebrity_phot...
I'd much prefer they scan their shared albums on the server.