Comment by BlackFly
1 month ago
This discussion got quite long without anyone mentioning the novel technical implementation paper "Scalable Private Search with Wally". Kudos to Apple for researching this, but shame on them for enabling this by default.
Because the scheme is somewhat homomorphic rather than fully homomorphic, each query releases some bits of information about what you are searching for, so the server can avoid touching the whole database. Subsequent queries from a given user will generally be correlated enough to build a tracer. Governments or other sufficiently powerful actors can pierce the proxy veil; indeed, the tracer will be able to deanonymize you given enough recorded queries.
How many queries? For me it is too tedious to work out the math from the differential privacy definitions, and besides, I already know the landmarks around me: I don't want such a feature.
Hi,
Very curious here, as I haven't seen any papers demonstrating attacks against the differential privacy systems proposed by Apple or Google that successfully deanonymize data. Such an attack, even on a test database, would be very interesting.
Do you have any papers you can cite about this entropic leakage you’re describing?
The very difference between somewhat and fully homomorphic encryption hinges on this leakage, as explained in the paper, and so does the definition of differential privacy. They directly admit to leaking a certain amount of information by stating that they apply differential privacy with the given parameters. The issue I am pointing at is that these guarantees are stated for a single query, but correlations across queries (the thing that actually happens with metadata) are not covered by a single-query (ε, δ) differential privacy bound, by definition: the privacy loss composes and accumulates over repeated queries.
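To make the composition point concrete, here is a minimal sketch of how per-query privacy loss accumulates. The per-query parameters below are purely hypothetical placeholders, not Apple's actual Wally parameters; the formulas are the standard basic and advanced composition theorems for (ε, δ)-differential privacy.

```python
import math

# Hypothetical per-query budget -- NOT the real Wally parameters.
EPS_PER_QUERY = 0.1
DELTA_PER_QUERY = 1e-6


def basic_composition(n, eps=EPS_PER_QUERY, delta=DELTA_PER_QUERY):
    """Basic sequential composition: epsilons and deltas add linearly."""
    return n * eps, n * delta


def advanced_composition(n, eps=EPS_PER_QUERY, delta=DELTA_PER_QUERY,
                         delta_prime=1e-6):
    """Advanced composition (Dwork-Rothblum-Vadhan): n runs of an
    (eps, delta)-DP mechanism are (eps_total, n*delta + delta_prime)-DP with
    eps_total = eps*sqrt(2*n*ln(1/delta_prime)) + n*eps*(e^eps - 1)."""
    eps_total = (eps * math.sqrt(2 * n * math.log(1 / delta_prime))
                 + n * eps * (math.exp(eps) - 1))
    return eps_total, n * delta + delta_prime


if __name__ == "__main__":
    for n in (1, 10, 100, 1000):
        b_eps, _ = basic_composition(n)
        a_eps, _ = advanced_composition(n)
        print(f"n={n:5d}  basic eps={b_eps:8.2f}  advanced eps={a_eps:8.2f}")
```

The point is qualitative, not the exact numbers: even with a tight per-query budget, the cumulative ε grows with the number of queries, so a long enough query history from one user stops being meaningfully private.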
So if you are curious, read up on differential privacy in general, or start with this classic: https://gwern.net/death-note-anonymity Discussed here (possibly in other places too): https://news.ycombinator.com/item?id=9553494