Comment by oneplane

1 month ago

Yep. It's essentially an implementation of remote attestation "the other way around". Normally the edge device is untrusted and has to attest a variety of things before compute is done and the result is accepted; with PCC it's reversed, and the edge device holds the keys (technically Octagon works that out, but it's backed by the on-device SEP).
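
To make the "reversed attestation" direction concrete, here's a minimal sketch of a client refusing to send data until the server's software measurement checks out. All names are illustrative, and the HMAC is a stand-in for the asymmetric signatures a real attestation protocol (and PCC's transparency log) would use:

```python
# Toy sketch of "reversed" remote attestation: the client (edge device)
# verifies the server's software measurement before sending any data.
# Names and the HMAC-based binding are illustrative, not Apple's protocol.
import hashlib, hmac, os

# Measurements the client is willing to trust (in PCC, a public,
# append-only transparency log of signed release hashes plays this role).
TRUSTED_MEASUREMENTS = {
    hashlib.sha256(b"pcc-release-1.0").hexdigest(),
}

def server_attest(software_blob: bytes, nonce: bytes, key: bytes) -> dict:
    """Server reports a hash of its software, bound to a fresh client nonce."""
    measurement = hashlib.sha256(software_blob).hexdigest()
    mac = hmac.new(key, measurement.encode() + nonce, hashlib.sha256).hexdigest()
    return {"measurement": measurement, "mac": mac}

def client_verify(report: dict, nonce: bytes, key: bytes) -> bool:
    """Accept only if the measurement is trusted and bound to our nonce
    (the nonce binding prevents replay of an old, honest report)."""
    expected = hmac.new(key, report["measurement"].encode() + nonce,
                        hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, report["mac"])
            and report["measurement"] in TRUSTED_MEASUREMENTS)

nonce = os.urandom(16)
key = os.urandom(32)  # stand-in for an attestation key rooted in hardware
report = server_attest(b"pcc-release-1.0", nonce, key)
bad = server_attest(b"pcc-release-evil", nonce, key)
assert client_verify(report, nonce, key)      # known-good build: accept
assert not client_verify(bad, nonce, key)     # unknown build: refuse to send
```

The point is only the direction of trust: the client holds the list of acceptable measurements, so an unrecognized server build gets no data at all.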

So it does it multiple ways:

- Finite sets and added noise: doesn't hurt performance too much, but does make it nearly impossible to identify or locate a photo

- Encryption at rest and in transit

- Transit over hops they don't own

- Homomorphic Encryption during remote compute
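
The first bullet can be sketched in a few lines: quantize an embedding to a finite set of values and add calibrated (Laplace) noise before anything leaves the device. The dimension, epsilon, and rounding granularity below are all illustrative, not Apple's parameters:

```python
# Rough sketch of "finite sets + added noise": perturb an embedding with
# Laplace noise, then snap it to a small finite grid, so the shared vector
# can't pinpoint the original photo. All parameters are illustrative.
import math, random

def laplace_noise(scale: float) -> float:
    """Sample from Laplace(0, scale) via inverse-CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def privatize(embedding, epsilon=1.0, sensitivity=1.0):
    scale = sensitivity / epsilon          # more privacy -> more noise
    noisy = [v + laplace_noise(scale) for v in embedding]
    # quantize to one decimal place: a finite set of possible outputs
    return [round(v, 1) for v in noisy]

noisy_vec = privatize([0.12, -0.87, 0.44])  # 3 quantized values, new each run
```

The utility/privacy trade-off the comment alludes to lives in `epsilon`: performance (match quality) degrades gracefully, while re-identifying a single photo from the noisy, quantized vector becomes statistically hard.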

The data it finds is available in two forms: the "result" and the vector embedding. I'm not sure which one you end up consuming, since it also has to work on older models that might not be able to load the embeddings and perform adequately. But it doesn't really matter: the data itself is unique, so you can't do parallel reconstruction, and it's also uniquely meaningless to anyone without a key. They are essentially computing on needles that aren't in a haystack, but in a needle stack.
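
"Uniquely meaningless to anyone without a key" is exactly what homomorphic encryption buys. A hedged sketch of the principle, using textbook Paillier with an absurdly small key (Apple's actual stack uses lattice-based HE; Paillier is just the simplest scheme that shows a server computing on ciphertexts it cannot read):

```python
# Toy additively homomorphic encryption (textbook Paillier, tiny key) showing
# a server computing a weighted sum over ciphertexts it cannot decrypt.
# Illustrative only: real keys are 2048+ bits, and PCC uses lattice-based HE.
import math, random

p, q = 1000003, 1000033            # toy primes
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = (p - 1) * (q - 1)
mu = pow(lam, -1, n)               # private key material stays on-device

def encrypt(m: int) -> int:
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# client encrypts its integer-scaled embedding and ships only ciphertexts
embedding = [3, 1, 4]
cts = [encrypt(v) for v in embedding]

# server combines ciphertexts using only public operations:
#   Enc(a) * Enc(b) = Enc(a + b)   and   Enc(a)^w = Enc(w * a)
weights = [2, 0, 5]
acc = encrypt(0)
for c, w in zip(cts, weights):
    acc = (acc * pow(c, w, n2)) % n2

# only the key holder learns the dot product: 2*3 + 0*1 + 5*4 = 26
assert decrypt(acc) == sum(w * v for w, v in zip(weights, embedding))
```

The server sees only large, randomized integers; without `lam` and `mu`, the intermediate values and the result are noise.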

The primitives all this is built on have been around for quite a while, including their HBONE implementation, the cryptographically hidden data distribution, and the SEP. So far it has been the only one of its kind, outside of disjointed options like buying and operating your own HSM, a large Tor network, and a yet-to-be-invented self-hosted PCC solution (AMD was supposed to release something but failed at it, just not as badly as Intel messed up with SGX).

Technically, even with everything else removed, just good TLS 1.2+ and homomorphic encryption would have been more than any other mass-market manufacturer has ever done effectively. But adding the extra factors, such as degrees of separation so they couldn't get in themselves (without breaking it for everyone in the process), is what makes this so much more robust.
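
The "degrees of separation" idea is roughly OHTTP-style relaying: the relay learns who is asking but not what, and the gateway learns what is asked but not who, so neither party alone (including the operator) can correlate identity with content. A very rough sketch, where XOR masking is a loud stand-in for real public-key encryption such as HPKE:

```python
# Toy sketch of OHTTP-style "degrees of separation": the relay sees the
# client's address but an opaque blob; the gateway sees the payload but only
# the relay's address. XOR here is a stand-in for real encryption (HPKE).
import os

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

gateway_key = os.urandom(32)   # shared by client and gateway, NOT the relay

def client_send(payload: bytes) -> dict:
    # client encrypts for the gateway, then hands the blob to the relay
    return {"from": "203.0.113.7",
            "blob": xor(payload.ljust(32, b"\x00"), gateway_key)}

def relay_forward(msg: dict) -> dict:
    # relay strips the client's identity; it cannot read the blob
    return {"from": "relay", "blob": msg["blob"]}

def gateway_receive(msg: dict):
    # gateway decrypts the payload but only ever sees the relay's address
    return msg["from"], xor(msg["blob"], gateway_key).rstrip(b"\x00")

sender, text = gateway_receive(relay_forward(client_send(b"find: beach photo")))
assert sender == "relay" and text == b"find: beach photo"
```

Breaking this separation would require the operator to collude with (or run) both hops at once, which is exactly the "can't get in themselves without breaking it for everyone" property described above.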