Comment by vrtx0

4 days ago

Is the Apple Photos feature mentioned actually implemented using Wally, or is that just speculation?

From a cursory glance, computing the centroids on the client device seems to obviate the need to send embedding vectors of potentially sensitive photo details; is that incorrect?

I’d be curious to read a report of how on-device-only search (using latest hardware and software) is impacted by disabling the feature and/or network access…

According to this post on Apple's Machine Learning blog, yes, Wally is the method used for this feature.

https://machinelearning.apple.com/research/homomorphic-encry...

  • Thank you! This is exactly the information the OP seems to have missed. It seems to confirm my suspicion that the author’s concerns about server-side privacy are unfounded — I think:

    > The client decrypts the reply to its PNNS query, which may contain multiple candidate landmarks. A specialized, lightweight on-device reranking model then predicts the best candidate…

    [please correct me if I missed anything — this used to be my field, but I’ve been disabled for 10 years now, so grain of salt]
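For what it's worth, the client-side part of the flow being discussed (pick the nearest centroid on device, send only an encrypted query for that cluster, then rerank the decrypted candidates locally) can be sketched in plain Python. This is a toy illustration with made-up data and hypothetical function names, not Apple's actual implementation; the real system uses homomorphic encryption and a learned embedding model, neither of which is modeled here.

```python
import math

def nearest_centroid(embedding, centroids):
    """On-device step: find the cluster centroid closest to the photo's
    embedding. Only this cluster index drives the (encrypted) server query;
    the plaintext embedding never leaves the device."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(range(len(centroids)), key=lambda i: dist(embedding, centroids[i]))

# Toy data: a 2-D "embedding" of a photo and three cluster centroids.
photo_embedding = [0.9, 0.1]
centroids = [[0.0, 1.0], [1.0, 0.0], [0.5, 0.5]]

cluster_id = nearest_centroid(photo_embedding, centroids)
print(cluster_id)  # → 1 (closest centroid to the toy embedding)

# The PNNS reply (decrypted on device) may contain several candidate
# landmarks; per the quote above, a lightweight on-device model then
# reranks them. A stand-in for that step:
candidates = [("Landmark A", 0.42), ("Landmark B", 0.91), ("Landmark C", 0.17)]
best = max(candidates, key=lambda c: c[1])
print(best[0])  # → Landmark B
```

The point of the sketch is the data flow: everything sensitive (the embedding, the decrypted candidates, the final reranking) stays on the client, which is what the centroid-plus-PNNS design is meant to guarantee.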

You have to be quick if you want to disable the feature: scanning starts on OS install, and disabling it requires actively opening the Photos app and turning the setting off.