Comment by kccqzy
3 years ago
You missed the widely panned Apple iCloud child sexual abuse imagery detection feature. Its private set intersection protocol is essentially homomorphic encryption. Amid the very valid policy critiques, people forget that it's actually a nifty piece of engineering. (This is not an endorsement of that feature.) https://www.apple.com/child-safety/pdf/Apple_PSI_System_Secu...
I'm also working closely with a team at $WORK that's using a protocol very similar to Apple's, though not for CSAM detection. We are seeing some severe pushback on this technology. I wouldn't be surprised if there are multiple homomorphic encryption based products at Big Tech that have yet to see the light of day.
I think it is deeply counterintuitive, even to professional programmers, that you can compute on data without letting the computer see it. I literally had to work through the entire paper to understand that this can actually work, as long as the human in the loop doesn't screw up (see my summary [1], which I revised multiple times at the time).
[1] https://news.ycombinator.com/item?id=28223141
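To make the counterintuitive bit concrete: here's a toy additively homomorphic scheme (textbook Paillier, with deliberately tiny and insecure parameters, and unrelated to Apple's actual PSI protocol). The server can multiply two ciphertexts and produce an encryption of the sum of the plaintexts without ever seeing either plaintext:

```python
import math, random

# Toy Paillier cryptosystem. Parameters are tiny and NOT secure;
# real deployments use 2048+ bit moduli.
p, q = 17, 19                    # toy primes (assumption: gcd(pq, (p-1)(q-1)) == 1)
n = p * q                        # public modulus
n2 = n * n
g = n + 1                        # standard generator choice for Paillier
lam = math.lcm(p - 1, q - 1)     # private key

def L(x):
    # The "L function" from the Paillier paper: L(x) = (x - 1) / n
    return (x - 1) // n

# Precomputed decryption constant mu = L(g^lam mod n^2)^(-1) mod n
mu = pow(L(pow(g, lam, n2)), -1, n)

def encrypt(m):
    # c = g^m * r^n mod n^2, with random r coprime to n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

c1, c2 = encrypt(5), encrypt(7)
# The key trick: multiplying ciphertexts adds the hidden plaintexts.
c_sum = (c1 * c2) % n2
print(decrypt(c_sum))  # → 12
```

Whoever holds only the public key (n, g) can do the `(c1 * c2) % n2` step on data they cannot read; only the holder of `lam` learns the result. That's the whole mind-bending idea in miniature.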