> If a company open sources their code under BSD3-clear, they can sell additional licenses to use the patents included in the open source code. In this case, it still isn’t the software that is sold, but rather the usage rights of the patented intellectual property it contains.
Concrete from Zama is built on a completely different scheme from the one used in this product; the former uses TFHE, while the latter uses the BFV algorithm.
Zama has already hit competitors with (French) patent charges. Apple's HE implementation is in Swift and uses BFV, which is very different HE from anything Zama does and doesn't use their source.
> The two hops, the two companies, are already acting in partnership, so what is there technically in the relay setup to stop the two companies from getting together—either voluntarily or at the secret command of some government—to compare notes, as it were, and connect the dots?
The OHTTP scheme does not _technically_ prevent this.
It increases the number of parties that need to cooperate to extract this information, in the hope that the attempt would be caught somewhere in the pipeline.
And if a government has, say, already forced ISPs to collect metadata about who connects to whom and when, I imagine it doesn't even need to bother getting data from the relay hosts.
It's cool how neural networks, even convolutional ones, are one of the few applications that you can compute through homomorphic encryption without hitting a mountain of noise/bootstrapping costs. Minimal depth, hurrah!
I don't think Apple is doing this. They compute the embedding on device in the clear, and then just do the nearest-neighbor part in HE (which does lots of dot products but no neural network).
There are people doing NNs in HE, but most implementations do indeed require bootstrapping, and for that reason they usually use CKKS.
I was going to make my usual comment of FHE being nice in theory but too slow in practice, and then the article points out that there’s now SHE (somewhat homomorphic encryption). I wasn’t aware that the security guarantees of FHE could be relaxed without sacrificing them. That’s pretty cool.
Is there any concrete info about noise budgets? It seems like that’s the critical concern, and I’d like to understand at what point precisely the security breaks down if you have too little (or too much?) noise.
SHE vs FHE has nothing to do with security. Instead, it’s about how many operations (eg homomorphic multiplications and additions) can be performed before the correctness of the scheme fails due to too much noise accumulating in the ciphertext. Indeed all FHE schemes are also SHE schemes.
What typically makes FHE expensive computationally is a “bootstrapping” step for removing the noise that accumulates after X operations and threatens correctness. After bootstrapping you can do another X operations. Rinse and repeat until you finish the computation you want to perform.
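The "X operations, then bootstrap" budget above can be pictured with a toy bookkeeping model. The costs below are made up for illustration and don't correspond to any real scheme's parameters:

```python
# Toy model of an SHE noise budget (illustrative numbers, not a real scheme):
# each ciphertext starts with some "noise bits" to spare; additions are cheap,
# multiplications expensive; decryption fails once the budget is exhausted.
def simulate(ops, budget=120, add_cost=1, mul_cost=20):
    """Return how many operations complete before the noise budget runs out."""
    done = 0
    for op in ops:
        budget -= mul_cost if op == "mul" else add_cost
        if budget < 0:
            break  # too much noise: the ciphertext no longer decrypts correctly
        done += 1
    return done

# With these made-up costs, a scheme sized for budget=120 supports a
# multiplicative depth of 6 but fails on the 7th multiplication.
print(simulate(["mul"] * 10))   # -> 6
```

In an FHE scheme, bootstrapping is what resets `budget` mid-computation; in an SHE scheme you must pick parameters so the whole circuit fits within the initial budget.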
I'm not an expert on this, but my understanding is that "noise" is less a security breakdown and more the entire system breaking down. That is where the "somewhat" comes in: unlike "full", where the system can do (expensive) things to get rid of noise, in "somewhat" the noise just accumulates until the system stops working. (Definitely talking through my hat here)
Interesting! Are there any security concerns with SHE? If not, it sounds like all of the benefits of FHE with none of the downsides, other than the noise overwhelming the system. If that’s true, and SHE can run at least somewhat inexpensively, then this could be big. I was once super hyped about FHE till I realized it couldn’t be practical, and this has my old excitement stirring again.
It's incredibly algorithm-dependent. If you look into Gentry's thesis, which originated the 'bootstrapping' technique for transforming SHE schemes into FHE, it determines the noise limit of its specific algorithm in section 7.3 and then investigates expanding the noise limit in sections 8 and 10.
Not sure whether they use FHE or not (the literature I’m looking at simply says “homomorphic”), but we use ElectionGuard at our company to help verify elections for our customers. So there are definitely some practical uses.
> One example of how we’re using this implementation in iOS 18, is the new Live Caller ID Lookup feature, which provides caller ID and spam blocking services. Live Caller ID Lookup uses homomorphic encryption to send an encrypted query to a server that can provide information about a phone number without the server knowing the specific phone number in the request.
There is another reason I dislike this: now Apple has an excuse for "encrypted" data to be sent randomly, or at least every time you take a picture. If in the future they silently change the Photos app (a real risk that I have emphasized in the past), they can now silently pass along a hash of the photo and no one would be the wiser.
If an iPhone was not sending any traffic whatsoever to the mothership, at least it would ring alarm bells if it suddenly started doing so.
Isn't this the same argument as saying they can change any part of the underlying OS and compromise security by exfiltrating secret data? Why is it specific to this Photos feature?
No. GP means that if the app was not already phoning home then seeing it phone home would ring alarm bells, but if the app is always phoning home if you use it at all then you can't see "phoning home" as an alarm -- you either accept it or abandon it.
Whereas if the app never phoned home and then upon upgrade it started to then you could decide to kill it and stop using the app / phone.
Of course, realistically <.00001% of users would even check for unexpected phone home, or abandon the platform over any of this. So in a way you're right.
Any app can do this, really; it just can't update the entitlements and a few other things. I would think it unlawful for Apple's own apps to have access to functionality/APIs that others don't…
I love homomorphic encryption, but why can't they just do a neural search locally?
- iOS Photos -> Vectors
- Search Query "Dog photos" -> Vectors
- Result (Cosine Similarity): Look, some dog photos!
iPhones have plenty of local storage and compute power for doing this kind of thing when the phone is idle. And cosine similarity can work quickly at runtime.
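A minimal sketch of that local pipeline, assuming the embeddings already exist (random stand-ins below; a real app would run an on-device image/text encoder such as a CLIP-style model):

```python
import numpy as np

rng = np.random.default_rng(0)

# "iOS Photos -> Vectors": pretend these are precomputed photo embeddings.
photo_vecs = rng.normal(size=(10_000, 512)).astype(np.float32)
photo_vecs /= np.linalg.norm(photo_vecs, axis=1, keepdims=True)  # unit-normalize once

def search(query_vec, k=3):
    """On unit vectors, cosine similarity is just a dot product."""
    q = query_vec / np.linalg.norm(query_vec)
    scores = photo_vecs @ q                 # one matrix-vector product
    return np.argsort(scores)[::-1][:k]     # indices of the top-k photos

# 'Search Query "Dog photos" -> Vectors' (stand-in), then look up the results.
top = search(rng.normal(size=512).astype(np.float32))
```

Even a brute-force scan like this over a personal library (tens of thousands of vectors) is a single small matrix product, which is the basis for the "why not just do it locally?" question.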
> This seems like a lot of data the client is getting anyway. I don’t blame you for questioning if the server is actually needed. The thing is, the stored vectors that are compared against are by far the biggest storage user. Each vector can easily be multiple kilobytes. The paper discusses a database of 35 million entries divided across 8500 clusters.
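Plugging in the quoted numbers shows why the server-side database can't simply be shipped to the client. Assuming, say, 3 KB per vector (the post only says "multiple kilobytes", so this is a guess):

```python
# Back-of-envelope for the quoted database: 35 million vectors at ~3 KB each.
entries = 35_000_000
bytes_per_vector = 3 * 1024
total_gb = entries * bytes_per_vector / 1024**3
print(f"{total_gb:.0f} GB")   # ~100 GB: far too large to ship to every phone
```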
Because the blog post needs some sort of concrete example to explain, but all concrete examples of fully homomorphic encryption are generally done better locally at the moment, due to the extreme costs of FHE.
This would be even more exciting if there were some way to guarantee your phone, the servers, etc. are running untampered implementations, and that the proxies aren't colluding with Apple.
If someone or something can tamper with your phone, then nobody needs to collude with proxies or Apple. They can just ask your phone to send them exactly what they want, without all the homomorphic encryption dance.
The idea that Apple is going to use this feature to spy on you, completely misses the fact that they own the entire OS on your phone, and are quite capable of directly spying on you via your phone if they wanted to.
Upgrades have to be possible. What you want probably is attestation that you're running a generally available version that other users run too as opposed to one specially made for you, but since a version could be made for all those subject to surveillance this wouldn't be enough either.
I'm not sure there's a way out of this that doesn't involve open source and repeatable builds (and watch out for Reflections on Trusting Trust).
I’ve been using it a lot recently. Multiple times even today while I’ve been trying to find just the right photos of my theater for a brochure I’m putting together. I have over 100,000 photos in Apple photos so even if I vaguely remember when I took a photo it’s still difficult to find it manually.
As a concrete example, someone on my team today asked me “can you send me that photo from the comedy festival a couple years ago that had the nice projections on the proscenium?”. I searched apple photos (on my phone, while hiking through a park) for “sketchfest theater projection”. It used the OCR to find Sketchfest and presumably the vector embeddings of theater and projection. The one photo she was referring to was the top result. It’s pretty impressive.
It can’t always find the exact photo I’m thinking of the first time, but I can generally find any vaguely-remembered photo from years ago without too much effort. It is pretty magical. You should get in the habit of trying it out, you’ll probably be pleasantly surprised.
Do you mean the ability to search in Apple Photos is “privacy-bruising”, or are you referring to landmark identification?
If the latter, please note that this feature doesn’t actually send a query to a server for a specific landmark — your device does the actual identification work. It’s a rather clever feature in that sense…
Is the Apple Photos feature mentioned actually implemented using Wally, or is that just speculation?
From a cursory glance, the computation of centroids done on the client device seems to obviate the need for sending embedded vectors of potentially sensitive photo details — is that incorrect?
I’d be curious to read a report of how on-device-only search (using latest hardware and software) is impacted by disabling the feature and/or network access…
Thank you! This is exactly the information the OP seems to have missed. It seems to confirm my suspicion that the author’s concerns about server-side privacy are unfounded — I think:
> The client decrypts the reply to its PNNS query, which may contain multiple candidate landmarks. A specialized, lightweight on-device reranking model then predicts the best candidate…
[please correct me if I missed anything — this used to be my field, but I’ve been disabled for 10 years now, so grain of salt]
You have to be quick if you want to disable the feature, as the scan starts on OS install, and disabling requires you to actively open the Photos app and turn it off.
I'm not an expert in homomorphic encryption by any stretch (and I'm arguably the target audience for this blog post — a curious novice), but there's one thing I don't quite get from this post.
In the "appeal to cryptographers" section (which I really look forward to being fulfilled by someone, hopefully soon!), HE is equated to post-quantum cryptography. As far as I know, most current post-quantum encryption focuses on the elimination of Diffie-Hellman schemes (both over finite fields and over elliptic curves) since those are vulnerable to Shor's algorithm.
However, it's clear from the code samples later in the post (and not explained in the text, afaict) that a public key gets used to re-encrypt the resultant value of a homomorphic add or multiply.
Is this a case of false equivalence (in the sense that HE != post-quantum), or is it more the case that there's some new asymmetric cryptography scheme that's not vulnerable to Shor's?
All modern HE schemes rely on post-quantum crypto. For example, the ring-LWE problem used by BFV is the same as Kyber (ML-KEM) but with different parameters.
The twist in FHE is that the server also has an encryption of the user's secret key, which adds an assumption called "circular security", and that's needed to do some homomorphic operations like key switching.
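To make the LWE flavor concrete, here is a toy (wildly insecure, non-ring) LWE encryption of single bits, showing the small noise term and the additive homomorphism. Parameters are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
q, n = 2**15, 64                       # toy parameters: far too small to be secure
s = rng.integers(0, q, n)              # secret key

def encrypt(bit):
    a = rng.integers(0, q, n)          # public random vector
    e = int(rng.integers(-4, 5))       # small noise, the heart of LWE hardness
    b = (a @ s + e + bit * q // 2) % q # encode the bit in the high half of Z_q
    return a, b

def decrypt(ct):
    a, b = ct
    m = (b - a @ s) % q                # recovers bit*q/2 + small noise
    return int(q // 4 < m < 3 * q // 4)

def add(ct1, ct2):                     # homomorphic XOR; note the noises add up
    return (ct1[0] + ct2[0]) % q, (ct1[1] + ct2[1]) % q

c1, c2 = encrypt(1), encrypt(0)
assert decrypt(add(c1, c2)) == 1 ^ 0
```

Each homomorphic add grows the noise; once it exceeds q/4, decryption rounds to the wrong bit, which is exactly the "noise budget" discussed upthread. Homomorphic multiplication and key switching (where circular security enters) are considerably more involved and are omitted here.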
This is so cool! I first learned about homomorphic encryption in the context of an election cybersecurity class and it seemed so pie-in-the-sky, something that would unlikely be used for general practical purposes and only ever in very niche areas. Seeing a big tech company apply it in a core product like this really does feel like a step in the right direction towards taking back some privacy.
> This should be fine: vectorization is a lossy operation. But then you would know that Amy takes lots of pictures of golden retrievers, and that is a political disaster.
This downplays the issue. Knowing that Alice takes lots of screenshots of Winnie the Pooh memes means that Alice’s family gets put into Xinjiang concentration camps, not just a political disaster.
(This is a contrived example: iCloud Photos is already NOT e2ee and this is already possible now; but the point stands, as this would apply to people who have iCloud turned off, too.)
Agreed. And for a less contrived example, people may have photos of political protests that they attended (and the faces of others present), screenshots that include sensitive messages, subversive acts, etc.
It's worth noting though that it's now possible to opt in to iCloud Photo e2ee with "Advanced Data Protection". [0]
iCloud Photo e2ee still shares hashes of the plaintext with Apple, which means they can see the full list of everyone who has certain files, even if they all have e2ee enabled. They can see who had it first, and who got it later, and which sets of iCloud users have which unique files. It effectively leaks a social graph.
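A sketch of the leak being described, assuming the service keeps a plaintext content hash alongside each encrypted upload (the function below is hypothetical, not Apple's API):

```python
import hashlib

# If the server stores a hash of the plaintext (e.g. for deduplication),
# identical files map to identical hashes no matter who uploads them, so the
# server can link every user holding the same file -- e2ee bodies or not.
def upload_metadata(user, plaintext):
    return {"user": user, "content_id": hashlib.sha256(plaintext).hexdigest()}

a = upload_metadata("alice", b"protest-flyer.jpg bytes")
b = upload_metadata("bob",   b"protest-flyer.jpg bytes")
assert a["content_id"] == b["content_id"]  # server learns Alice and Bob share a file
```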
This is about additional new photo search capabilities that are enabled by default and powered by sending (encrypted) data derived from your photos to the cloud, not locally powered AI search.
*Homomorphically encrypted. Be precise. Deliberately handwaving it away as "mere" encryption paints a worse picture than it actually is. Yes, possibly should have been opt-in, and a more human-friendly explanation of how it works would help. However, it's not nearly as bad as HN would have you believe. Very much a case of them pointing out that the emperor is naked, when in fact they have a few shirt buttons missing...
I mean if you'd like, you could reimplement the necessary client on an airgapped computer, produce an encrypted value, take that value to a networked computer, send it to the server, obtain an encrypted result that the server could not possibly know how to decrypt, and see if it has done the computation in question. No trust is required.
You could also observe all bits leaving the device from the moment you initialize it, and determine that only encrypted bits leave and that no private keys leave. That only leaves the gap of some side channel at the factory, and you could perform the calculation to see that the bits are encrypted only with the key you expect them to be encrypted with.
> You are Apple. You want to make search work like magic in the Photos app, so the user can find all their “dog” pictures with ease.
What if you're a user who doesn't care about searching for "dog" in your own photos? You might not even use the Photos app, but Apple still scans all your photos and sends data off-device without asking you.
Perhaps this complicated dance works, perhaps they have made no mistakes, perhaps no one hacks or coerces the relay host providers... they could still have just asked for consent the first time you open Photos (if you ever do) before starting the scan.
Exactly, I don't want my shit sent all across the internet without my explicit prior consent, period. No amount of explanation can erase Apple's fuck-up.
Apple does photo recognition on your device.
Google, on the other hand, uploads photos to their server and does the analysis there.
There is the infamous case of the parents who Google tried to have arrested after they used their Android device to seek medical assistance for their child during lockdown. Their doctor asked them to send images of the problem, and Google called the police and reported the parents for kiddie porn.
> “I knew that these companies were watching and that privacy is not what we would hope it to be,” Mark said. “But I haven’t done anything wrong.”
The police agreed. Google did not.
https://www.nytimes.com/2022/08/21/technology/google-surveil...
Google refused to return access to his account even after the police cleared him of wrongdoing.
They are not sending your actual photo, as has been covered at length on numerous threads on this very site.
Not wrong, but it’s interesting that Apple gets so much flak for this when Google and Microsoft don’t even try. If anything they try to invade privacy as much as possible.
Of course, maybe that question has its own answer. Apple markets itself as the last personal computing company where you are the customer, not the product, so they are held to a higher standard.
What they should do is process the photos locally while the phone is plugged in, and just tell people they need a better phone for that feature if it's too slow. Or do it on their Mac, if they own one, while that is plugged in.
Doesn't Photos.app on iOS sync with iCloud OOTB?
A quick shoutout to Ente Photos[0]. FOSS with an opt-in locally-run semantic search of your photos. The first encoding with a ton of photos may take a few minutes in the background, but after that it takes no time with subsequent photo uploads. I'm not sure why Apple is going through the trouble of uploading the photos and incorporating homomorphic encryption for something this simple, particularly with their push for local AI and their Neural Engine[2].
I also appreciate Ente's Guest View[1] that lets you select the photos you want to show someone in person on your phone to prevent the issue of them scrolling too far.
[0] https://github.com/ente-io/ente
[1] https://ente.io/blog/guest-view/
[2] https://en.wikipedia.org/wiki/Neural_Engine
It doesn't really matter if they ask you or not, ultimately you have to trust them, and if you don't trust Apple, why would you even use an iPhone?
Trust is never all or nothing. I trust Apple to an extent, but trust needs to be earned and maintained. I trust my mom, but if she suggested installing video cameras in my home for my "safety", or worse, she secretly installed video cameras in my home, then she would lose my trust.
Likewise, you need to trust your spouse or significant other, but if there are obvious signs of cheating, then you need to be suspicious.
An essential part of trust is not overstepping boundaries. In this case, I believe that Apple did overstep. If someone demands that you trust them blindly and unconditionally, that's actually a sign you shouldn't trust them.
How can you trust any mainstream "working" iPhone or Android device? You already mentioned open source Android distros. You mean those where no banking or streaming app works, because you have to use a replacement for GApps and the root / open bootloader prevents any form of DRM? That is not really an option for most people. I would love to have a Linux phone, even with a terrible user experience, as long as I do not lose touch with society. That, however, seems to be an impossible task.
As they didn't ask, I will trust them less
Android does this too. I don't really want all my photos indexed like that; I just want a linear timeline of photos, but I can't turn off their "memories" thing or all the analysis they do to them.
Android doesn't do this. Everything is opt-in.
Granted, they require you to opt in for the photos app to be usable, and if you go out of your way to avoid opting in, they make your photo-browsing life miserable with prompts and notifications. But you do still have to opt in.
I don't think Android does that. It's only Google Photos, and only if you upload them to the cloud; if you don't sync/upload them, you can't search them with specific terms.
Samsung at least does these "dog" cataloguing & searches entirely on-device, as trivially checked by disabling all network connectivity and taking a picture. It may ping home for several other reasons, though.
The "memories" part can be trivially done locally and probably is, it's really just reading the picture's "date taken", so it's conceptually as easy as a "sort by date". My old Android with whatever Photos app came with it (not Google's) shows this despite being disconnected for so long.
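That "sort by date" idea is simple enough to sketch; the photo data below is hypothetical:

```python
from datetime import date

# The "memories" feature, done locally: pick photos whose "date taken" matches
# today's month and day in an earlier year -- just metadata sorting, no cloud
# or image analysis required.
def memories(photos, today):
    hits = [(d, name) for d, name in photos
            if (d.month, d.day) == (today.month, today.day) and d.year < today.year]
    return sorted(hits)

photos = [(date(2021, 6, 14), "beach.jpg"),
          (date(2023, 6, 14), "dog.jpg"),
          (date(2023, 9, 2),  "party.jpg")]
print(memories(photos, today=date(2025, 6, 14)))  # beach.jpg and dog.jpg match
```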
There's nothing stopping either Apple or Google from giving users an option to just disable these connected features, globally or per-app. Just allow a "no cloud services" toggle switch in the Photos app, get the warning that $FEATURES will stop working, and be done.
I know why Google isn't doing this: they're definitely monetizing every bit of that analyzed content. Not really sure about Apple, though; it might be that they consider their setup with HE to be on par with no cloud connectivity, privacy-wise.
No Android phone I've ever owned automatically uploaded my photos without asking. What exactly do you mean when you say it does this too?
Uninstall Google Photos and install a dumb photos app. I think most Android phones don't even come with Google Photos preinstalled.
Uninstall (or disable) the stock Google Photos app and install `gallery2.apk`. You can download one from sketchy GitHub repos, or I think you can alternatively extract it from an emulator image.
What if you're a user and you're fed up with all the "magic"? What if you want a device that works reliably, consistently, and in ways you can understand from empirical observation if you pay attention?
Apple, Google, Microsoft and Samsung all seem to be tripping over each other in an effort to make this whole thing as ass-backwards as possible. Here is how, IMHO, it should work:
1) It scans stuff, detects faces and features. Locally or in the cloud or not at all, as governed by an explicit opt-in setting.
2) Fuck search. Search is not discoverable. I want to browse stuff. I want a list of objects/tags/concepts it recognized. I want a list of faces it recognized and the ability to manually retag them, and manually mark any that they missed. And not just a list of 10 categories the vendor thinks are most interesting. All of them. Alphabetically.
3) If you insist on search, make it work. I type in a word, I want all photos tagged with it. I click on a face, I want all photos with a matching face in them. Simple as that. Not "eventual consistency", not "keep refreshing, every 5th refresh I may show you a result", or other such breakage that's a staple experience of OneDrive Photos in particular.
Don't know about Apple, but Google, Microsoft and Samsung all refuse #2, and spectacularly fail at #3, and the way it works, I'm convinced it's intentional, as I can't even conceptualize a design that would exhibit such failures naturally.
EDIT:
4) (A cherry on a cake of making a sane product that works) Recognition data is stored in photo metadata, whether directly or in a sidecar file, in any of a bunch of formats sane people use, and is both exported along with the photos, and adhered to when importing new photos.
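Points 2 and 3 above amount to nothing more than a plain inverted index over tags, for example:

```python
from collections import defaultdict

# Hypothetical recognition output: plain tags per photo.
tags = {"img1.jpg": {"dog", "beach"},
        "img2.jpg": {"dog", "alice"},
        "img3.jpg": {"beach"}}

# Build the inverted index once; no eventual consistency involved.
index = defaultdict(set)
for photo, photo_tags in tags.items():
    for t in photo_tags:
        index[t].add(photo)

print(sorted(index))         # browse: every recognized tag, alphabetically
print(sorted(index["dog"]))  # search: all photos tagged "dog", every time
```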
> What if you're a user and you're fed up with all the "magic"?
This is a completely hypothetical scenario. If users with such requirements actually existed, PinePhones and similar devices would be significantly more popular.
2 replies →
You can vote with your wallet and get a Pine Phone or something similar, I guess.
[dead]
Well, not vouching for automated scanning or whatever, but the advantage of homomorphic encryption is that besides the power usage for the computation and the bandwidth to transmit the data, Apple doesn't learn anything about what's in your photos, only you can. So even if you don't use the feature, the impact is minimal for you
So don’t use the photos app. Just get an alternative camera app and you bypass all of this.
It's enabled by default, so you can't "bypass" it unless you're aware that you can turn it off. If you don't turn it off, it will continue to scan your photos and upload the data to Apple, whether you use the Photos app or not. (And, by the way, if the "learn from this app" option is enabled (which is, again, the default), iPadOS/iOS will also be intrusively collecting data on how you use that alternative camera app...)
It's using Concrete from Zama.
I didn't like their license because it's BSD-3-Clause-Clear but then they state:
"Zama’s libraries are free to use under the BSD 3-Clause Clear license only for development, research, prototyping, and experimentation purposes. However, for any commercial use of Zama's open source code, companies must purchase Zama’s commercial patent license"
So it's not free: you need to pay for a patent license, and they don't disclose how much it costs.
I recommend OpenFHE as an alternative free, open-source solution. I know it's C++ and not Rust, but there's no patent license, and it can do the same thing the blog post wants to do. It even has more features, like proxy re-encryption, which I think Concrete can't do.
How is this "BSD-licensed but only for research" not self-contradictory?
It's like saying: "FREE* candy! (Free to look at, eating is $6.95 / pound)"
They use the patent loophole. From https://www.zama.ai/post/open-source
> If a company open sources their code under BSD3-clear, they can sell additional licenses to use the patents included in the open source code. In this case, it still isn’t the software that is sold, but rather the usage rights of the patented intellectual property it contains.
Every day I like the Apache licence more.
BSD+"additional clause" is not BSD.
Just like 3+1 is not 3.
4 replies →
Concrete from Zama implements a completely different scheme from the one used in this product; the former uses TFHE (CGGI), while the latter uses BFV.
OpenFHE supports all major FHE schemes, including the BGV, BFV, CKKS, DM (FHEW), and CGGI (TFHE) schemes.
Copied from openfhe.org
Or maybe that's not what you meant...
Source? I'm unconvinced. They have been posting stuff about implementing HE primitives in Swift as of last year.
Zama has already hit competitors with (French) patent charges. Apple's HE implementation is in Swift and uses BFV, which is very different HE from anything Zama does and doesn't use their source.
1 reply →
> The two hops, the two companies, are already acting in partnership, so what is there technically in the relay setup to stop the two companies from getting together—either voluntarily or at the secret command of some government—to compare notes, as it were, and connect the dots?
The OHTTP scheme does not _technically_ prevent this. It increases the number of parties that need to cooperate to extract this information, in the hope that such collusion would be caught somewhere in the pipeline.
And if a government has, say, already forced ISPs to collect metadata about who connects to whom and when, I imagine it doesn't even need to bother getting data from the relay hosts.
It's cool how neural networks, even convolutional ones, are one of the few applications that you can compute through homomorphic encryption without hitting a mountain of noise/bootstrapping costs. Minimal depth, hurrah!
I don't think Apple is doing this. They compute the embedding on device in the clear, and then just do the nearest-neighbor part in HE (which does lots of dot products but no neural network).
There are people doing NNs in HE, but most implementations do indeed require bootstrapping, and for that reason they usually use CKKS.
I was going to make my usual comment of FHE being nice in theory but too slow in practice, and then the article points out that there’s now SHE (somewhat homomorphic encryption). I wasn’t aware that the security guarantees of FHE could be relaxed without sacrificing them. That’s pretty cool.
Is there any concrete info about noise budgets? It seems like that’s the critical concern, and I’d like to understand at what point precisely the security breaks down if you have too little (or too much?) noise.
SHE vs FHE has nothing to do with security. Instead, it’s about how many operations (eg homomorphic multiplications and additions) can be performed before the correctness of the scheme fails due to too much noise accumulating in the ciphertext. Indeed all FHE schemes are also SHE schemes.
What typically makes FHE expensive computationally is a “bootstrapping” step for removing the noise that accumulates after X operations and threatens correctness. After bootstrapping you can do another X operations. Rinse and repeat until you finish the computation you want to perform.
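A toy illustration of that noise budget (invented parameters, not a real scheme): an LWE-style bit encryption where each homomorphic addition sums the two noises, so correctness holds only while the accumulated noise stays under q/4.

```python
import numpy as np

rng = np.random.default_rng(42)
q, n = 2**15, 64              # toy modulus and LWE dimension (insecure)
s = rng.integers(0, 2, n)     # binary secret key

def encrypt(bit, noise_mag=200):
    # Ciphertext is (a, b) with b = <a, s> + (q/2)*bit + small noise
    a = rng.integers(0, q, n)
    e = int(rng.integers(-noise_mag, noise_mag + 1))
    return a, (int(a @ s) + (q // 2) * bit + e) % q

def add(c1, c2):
    # Homomorphic XOR: the result's noise is the SUM of both noises
    return (c1[0] + c2[0]) % q, (c1[1] + c2[1]) % q

def decrypt(c):
    phase = (c[1] - int(c[0] @ s)) % q
    return 1 if q // 4 <= phase < 3 * q // 4 else 0

# A few additions still decrypt correctly (total noise well under q/4 = 8192)
acc = encrypt(1)
for b in [1, 0, 1]:
    acc = add(acc, encrypt(b))
print(decrypt(acc))  # 1, i.e. 1 ^ 1 ^ 0 ^ 1

# After roughly (q/4) / noise_mag ~ 40 worst-case additions, though, the
# accumulated noise can cross q/4 and decryption silently returns garbage.
# That exhaustible margin is the "noise budget"; bootstrapping resets it.
```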
I'm not an expert on this, but my understanding is that "noise" is less a security breakdown and more a breakdown of the entire system. That's where the "somewhat" comes in: unlike "fully", where the system can do (expensive) things to get rid of noise, in "somewhat" the noise just accumulates until the system stops working. (Definitely talking out of my hat here.)
Interesting! Are there any security concerns with SHE? If not, it sounds like all of the benefits of FHE with none of the downsides, other than the noise overwhelming the system. If that’s true, and SHE can run at least somewhat inexpensively, then this could be big. I was once super hyped about FHE till I realized it couldn’t be practical, and this has my old excitement stirring again.
3 replies →
It's incredibly algorithm-dependent. If you look into the thesis that originated the 'bootstrapping' technique for transforming SHE schemes into FHE, it determines the noise limit of its specific algorithm in section 7.3 and then investigates expanding that limit in sections 8 and 10.
(written in 2009) http://crypto.stanford.edu/craig/craig-thesis.pdf
Some newer FHE schemes don't encounter a noise limit or don't use the bootstrapping technique.
All known FHE schemes use bootstrapping
4 replies →
SHE doesn’t relax security guarantees, it relaxes the class of supported computations
Not sure whether they use FHE or not (the literature I’m looking at simply says “homomorphic”), but we use ElectionGuard at our company to help verify elections for our customers. So there are definitely some practical uses.
More here: https://www.swift.org/blog/announcing-swift-homomorphic-encr...
> One example of how we’re using this implementation in iOS 18, is the new Live Caller ID Lookup feature, which provides caller ID and spam blocking services. Live Caller ID Lookup uses homomorphic encryption to send an encrypted query to a server that can provide information about a phone number without the server knowing the specific phone number in the request.
Privacy by design is always nice to see.
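A toy of the query pattern the quote describes. This is not Apple's actual protocol (which reportedly uses BFV); it's textbook Paillier with tiny, wildly insecure primes, doing a private table lookup via an encrypted one-hot selector.

```python
import math
import random

random.seed(7)

# Textbook Paillier keypair with toy primes: purely illustrative
p, q = 293, 433
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)

def enc(m):
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return pow(1 + n, m, n2) * pow(r, n, n2) % n2

def dec(c):
    return (pow(c, lam, n2) - 1) // n * mu % n

# Server's table: phone-number slot -> spam flag
db = [0, 1, 0, 0, 1]

# Client wants slot 4 without revealing which slot: it sends an encrypted
# one-hot selector. Enc(0) and Enc(1) look alike without the secret key.
selector = [enc(1 if i == 4 else 0) for i in range(len(db))]

# Server homomorphically computes sum_i selector_i * db_i: multiplying
# ciphertexts adds plaintexts; exponentiation scales by a plaintext.
result = 1
for c_i, d_i in zip(selector, db):
    result = result * pow(c_i, d_i, n2) % n2

print(dec(result))  # 1 -> slot 4 is flagged, yet the server never saw "4"
```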
I don't understand how it can work. I assume the spam list is shared by all users, otherwise it would not be useful at all.
Let's suppose Apple is evil (or they receive an order from a judge) and they want to know who is calling 5555-1234:
1) Add a new, empty "spam" numbers encrypted database to the server (so there are now two encrypted databases in the system).
2) Add the encrypted version of 5555-1234 to it.
3) When someone checks, reply with the correct answer from the real database, also check the second one, and send that result to the police.
4 replies →
There is another reason I dislike this: Apple now has a reason for "encrypted" data to be sent randomly, or at least every time you take a picture. If in the future they silently change the Photos app (a real risk that I have emphasized in the past), they can silently pass along a hash of the photo and no one would be the wiser.
If an iPhone was not sending any traffic whatsoever to the mothership, at least it would ring alarm bells if it suddenly started doing so.
Isn't this the same argument that they can change any part of the underlying OS and compromise the security by exfiltrating secret data? Why specific to this Photos feature.
No. GP means that if the app was not already phoning home then seeing it phone home would ring alarm bells, but if the app is always phoning home if you use it at all then you can't see "phoning home" as an alarm -- you either accept it or abandon it.
Whereas if the app never phoned home and then upon upgrade it started to then you could decide to kill it and stop using the app / phone.
Of course, realistically <.00001% of users would even check for unexpected phone home, or abandon the platform over any of this. So in a way you're right.
2 replies →
And they do silently change the applications: Maps has been updated for me via A/B testing. Messaging too.
Any app can do this, really; they just can't update the entitlements and a few other things. I would think it unlawful for Apple's own apps to have access to functionality/APIs that others don't…
I love homomorphic encryption, but why can't they just do a neural search locally?
- iOS Photos -> Vectors
- Search Query "Dog photos" -> Vectors
- Result (Cosine Similarity): Look some dog photos!
iPhones have plenty of local storage and compute power for doing this kind of thing when the phone is idle. And cosine similarity can work quickly at runtime.
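A minimal sketch of that pipeline, with random vectors standing in for real on-device embeddings (in practice a CLIP-style image encoder would produce `photo_vecs` and a text encoder would turn "dog photos" into `query_vec`):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for real embeddings, normalized to unit length
photo_vecs = rng.normal(size=(10_000, 512))
photo_vecs /= np.linalg.norm(photo_vecs, axis=1, keepdims=True)

# Pretend photo 7 is a dog photo and the text query embeds near it
query_vec = photo_vecs[7] + 0.02 * rng.normal(size=512)
query_vec /= np.linalg.norm(query_vec)

# For unit vectors, cosine similarity is just a matrix-vector product
scores = photo_vecs @ query_vec
top5 = np.argsort(scores)[::-1][:5]  # best matches first
```

Even brute force at this scale is a few million multiply-adds, well within interactive speed on a phone; approximate indexes only start mattering at far larger library sizes.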
Apparently they only do the cloud HE song and dance for landmarks, which is too big of a data set to realistically keep on-device.
I discuss this in the post:
> This seems like a lot of data the client is getting anyway. I don’t blame you for questioning if the server is actually needed. The thing is, the stored vectors that are compared against are by far the biggest storage user. Each vector can easily be multiple kilobytes. The paper discusses a database of 35 million entries divided across 8500 clusters.
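A toy sketch of the split the quote describes (tiny made-up numbers; the real system has ~8,500 clusters over 35 million vectors, and the within-cluster dot products run under HE on the server):

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_clusters, per_cluster = 64, 10, 100

# Toy database: unit points scattered around random unit centroids
centroids = rng.normal(size=(n_clusters, dim))
centroids /= np.linalg.norm(centroids, axis=1, keepdims=True)
labels = np.repeat(np.arange(n_clusters), per_cluster)
db = centroids[labels] + 0.05 * rng.normal(size=(n_clusters * per_cluster, dim))
db /= np.linalg.norm(db, axis=1, keepdims=True)

query = db[123]  # pretend this embedding was computed on-device

# Client side: only the small centroid table ships to the device, so
# routing the query to its nearest cluster is cheap and fully local.
cluster = int(np.argmax(centroids @ query))

# Server side: search just that one cluster; in the real protocol these
# dot products are evaluated under homomorphic encryption.
idx = np.where(labels == cluster)[0]
best = int(idx[np.argmax(db[idx] @ query)])
```

The client never needs the multi-gigabyte vector database, only the centroid table; the server never needs to see the query in the clear.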
Here's an open source iOS app[1] that does that. Incidentally, it's built using Apple's own MobileCLIP[2] models.
[1]: https://github.com/fguzman82/CLIP-Finder2 [2]: https://github.com/apple/ml-mobileclip
Because the blog post needs some sort of concrete example to explain, but all concrete examples of fully homomorphic encryption are generally done better locally at the moment, due to the extreme cost of FHE.
That’s what they do
This would be even more exciting if there were some way to guarantee your phone, the servers, etc. are running untampered implementations, and that the proxies aren't colluding with Apple.
If someone or something can tamper with your phone, then nobody needs to collude with proxies or Apple. They can just ask your phone to send them exactly what they want, without all the homomorphic encryption dance.
The idea that Apple is going to use this feature to spy on you, completely misses the fact that they own the entire OS on your phone, and are quite capable of directly spying on you via your phone if they wanted to.
That is exactly the goal of Private Cloud Compute, part of the fabric of Apple Intelligence: https://security.apple.com/blog/private-cloud-compute/#:~:te....
Upgrades have to be possible. What you probably want is attestation that you're running a generally available version that other users run too, as opposed to one specially made for you; but since a special version could be pushed to everyone subject to surveillance, even that wouldn't be enough.
I'm not sure there's a way out of this that doesn't involve open source and repeatable builds (and watch out for Reflections on Trusting Trust).
I never even knew images could be searched this way on a phone and the iPhone users in my family don't either.
A huge privacy-bruising feature for nothing in our case.
I’ve been using it a lot recently. Multiple times even today while I’ve been trying to find just the right photos of my theater for a brochure I’m putting together. I have over 100,000 photos in Apple photos so even if I vaguely remember when I took a photo it’s still difficult to find it manually.
As a concrete example, someone on my team today asked me “can you send me that photo from the comedy festival a couple years ago that had the nice projections on the proscenium?”. I searched apple photos (on my phone, while hiking through a park) for “sketchfest theater projection”. It used the OCR to find Sketchfest and presumably the vector embeddings of theater and projection. The one photo she was referring to was the top result. It’s pretty impressive.
It can’t always find the exact photo I’m thinking of the first time, but I can generally find any vaguely-remembered photo from years ago without too much effort. It is pretty magical. You should get in the habit of trying it out, you’ll probably be pleasantly surprised.
Do you mean the ability to search in Apple Photos is “privacy-bruising”, or are you referring to landmark identification?
If the latter, please note that this feature doesn’t actually send a query to a server for a specific landmark — your device does the actual identification work. It’s a rather clever feature in that sense…
Is the Apple Photos feature mentioned actually implemented using Wally, or is that just speculation?
From a cursory glance, the computation of centroids done on the client device seems to obviate the need for sending embedding vectors of potentially sensitive photo details — is that incorrect?
I’d be curious to read a report of how on-device-only search (using latest hardware and software) is impacted by disabling the feature and/or network access…
According to this post on Apple's Machine Learning blog, yes, Wally is the method used for this feature.
https://machinelearning.apple.com/research/homomorphic-encry...
Thank you! This is exactly the information the OP seems to have missed. It seems to confirm my suspicion that the author’s concerns about server-side privacy are unfounded — I think:
> The client decrypts the reply to its PNNS query, which may contain multiple candidate landmarks. A specialized, lightweight on-device reranking model then predicts the best candidate…
[please correct me if I missed anything — this used to be my field, but I’ve been disabled for 10 years now, so grain of salt]
5 replies →
You have to be quick if you want to disable the feature, as the scan starts on OS install, and disabling requires you to actively open the Photos app and turn it off.
Is there any library for e.g. FHE hash-table lookup or similar? I have seen papers but haven't seen a consensus on what a useful implementation looks like.
OpenFHE could be what you're looking for.
https://openfhe.org/
I'm not an expert in homomorphic encryption by any stretch (and I'm arguably the target audience for this blog post — a curious novice), but there's one thing I don't quite get from this post.
In the "appeal to cryptographers" section (which I really look forward to being fulfilled by someone, hopefully soon!), HE is equated to post-quantum cryptography. As far as I know, most current post-quantum encryption focuses on the elimination of Diffie-Hellman schemes (both over finite fields and over elliptic curves) since those are vulnerable to Shor's algorithm.
However, it's clear from the code samples later in the post (and not explained in the text, afaict) that a public key gets used to re-encrypt the resultant value of a homomorphic add or multiply.
Is this a case of false equivalence (in the sense that HE != post-quantum), or is it more the case that there's some new asymmetric cryptography scheme that's not vulnerable to Shor's?
All modern HE schemes rely on post-quantum crypto. For example, the ring-LWE problem used by BFV is the same as Kyber (ML-KEM) but with different parameters.
The twist in FHE is that the server also has an encryption of the user's secret key, which adds an assumption called "circular security", and that's needed to do some homomorphic operations like key switching.
Right on, thanks for the explanation!
So what gets called the "public key" in the blog post is just the (self?-)encrypted secret key from the user?
I'll read up on your other points after work -- appreciate the search ledes :)
1 reply →
There is an app that does the photo search locally on iPhones and it feels nothing short of magic: https://queryable.app/
This is a neat topic I want to get into more myself
Searching encrypted data is what I wondered about; in the past I had to decrypt everything before I could use a standard SQL `LIKE` search.
Funny post today about cosine similarity
This is so cool! I first learned about homomorphic encryption in the context of an election cybersecurity class and it seemed so pie-in-the-sky, something that would unlikely be used for general practical purposes and only ever in very niche areas. Seeing a big tech company apply it in a core product like this really does feel like a step in the right direction towards taking back some privacy.
Don't they have TEE?
Doing it on device would require too much data, and TEEs have side-channel vulnerabilities making it difficult to deploy securely in prod.
oh, if Apple had a very simple setting to toggle off sending photos to iCloud, oh, wait!!!
> This should be fine: vectorization is a lossy operation. But then you would know that Amy takes lots of pictures of golden retrievers, and that is a political disaster.
This downplays the issue. Knowing that Alice takes lots of screenshots of Winnie the Pooh memes means that Alice’s family gets put into Xinjiang concentration camps, not just a political disaster.
(This is a contrived example: iCloud Photos is already NOT e2ee and this is already possible now; but the point stands, as this would apply to people who have iCloud turned off, too.)
Agreed. And for a less contrived example, people may have photos of political protests that they attended (and the faces of others present), screenshots that include sensitive messages, subversive acts, etc.
It's worth noting though that it's now possible to opt in to iCloud Photo e2ee with "Advanced Data Protection". [0]
[0] https://support.apple.com/en-us/102651
iCloud Photo e2ee still shares hashes of the plaintext with Apple, which means they can see the full list of everyone who has certain files, even if they all have e2ee enabled. They can see who had it first, and who got it later, and which sets of iCloud users have which unique files. It effectively leaks a social graph.
It’s also not available in China.
1 reply →
That's the joke/implication.
[dead]
[flagged]
This is about additional new photo search capabilities that are enabled by default and powered by sending (encrypted) data derived from your photos to the cloud, not locally powered AI search.
To be fair, it sounds like it's both. Local, AI powered 'neural hashing' + SHE
*Homomorphically encrypted. Be precise. Deliberately handwaving it away as "mere" encryption paints a worse picture than it actually is. Yes, possibly should have been opt-in, and a more human-friendly explanation of how it works would help. However, it's not nearly as bad as HN would have you believe. Very much a case of them pointing out that the emperor is naked, when in fact they have a few shirt buttons missing...
> There is no trust me bro element. > Barring some issue being found in the math or Apple’s implementation of it
Yes, if you bar the "trust me bro" element in your definition, you'll by definition have no such element.
Reality, though, doesn't care about your definition, so in reality this is exactly the "trust me bro" element that exists.
> But we’re already living in a world where all our data is up there, not in our hands.
If that's your real view, then why do you care about all this fancy encryption at all? It doesn't help if everything is already lost
I mean if you'd like, you could reimplement the necessary client on an airgapped computer, produce an encrypted value, take that value to a networked computer, send it to the server, obtain an encrypted result that the server could not possibly know how to decrypt, and see if it has done the computation in question. No trust is required.
You could also observe all bits leaving the device from the moment you initialize it and determine that only encrypted bits leave and that no private keys leave, which only leaves the gap of some side channel at the factory, but you could perform the calculation to see that the bits are only encrypted with the key you expect them to be encrypted with.
How is this comment useful to the OP's valid arguments?