Comment by paxys

2 days ago

No need to make up hypotheticals. The server isn't the final destination for your LLM requests. The reply needs to come back to you.

If Bob and Alice are in an E2EE chat, Bob and Alice are the ends. Even if Bob asks Alice a question and she replies back to Bob, Alice is still an end.

Similarly with AI. The AI is one of the ends of the conversation.

  • So ChatGPT is end-to-end encrypted?

    • No, because there is a web server that exposes an API that accepts a plaintext prompt and returns a plaintext response (even though that API is served over TLS). Since this web server is not the same machine as the backend systems that actually process the prompt, it is a middle entity rather than an end in the system.

      The difference here is that with Confer, the web server receives an encrypted blob that only gets decrypted in memory inside the TEE where the data will actually be used, and that TEE IS an end in the system (a rough sketch of that flow is below this thread).

    • Is your point that TLS is typically decrypted by a web server rather than directly by the app the web server forwards traffic to?
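To make the distinction concrete, here is a minimal sketch in Python (using the `cryptography` package) of the kind of flow being described. This is not Confer's actual protocol, just an illustration under assumed names (`relay`, `enclave_key`, etc.): the client encrypts its prompt to a public key whose private half exists only inside the TEE, so the front-end web server can only forward an opaque blob.

```python
# Illustrative only: client encrypts to the enclave's key; the web server
# merely relays ciphertext and is therefore not an "end".
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def derive_key(private_key, peer_public_key) -> bytes:
    """X25519 ECDH followed by HKDF -> 256-bit AES-GCM key."""
    shared = private_key.exchange(peer_public_key)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"prompt-encryption-demo").derive(shared)

# Inside the TEE: generate a keypair; the public key would be published
# alongside an attestation document so clients can verify it.
enclave_key = X25519PrivateKey.generate()
enclave_public_key = enclave_key.public_key()

# On the client: encrypt the prompt to the enclave's public key.
ephemeral = X25519PrivateKey.generate()
nonce = os.urandom(12)
ciphertext = AESGCM(derive_key(ephemeral, enclave_public_key)).encrypt(
    nonce, b"my private prompt", None)
blob = (ephemeral.public_key(), nonce, ciphertext)  # what goes over the wire

# On the front-end web server: nothing to decrypt, it only forwards the blob.
def relay(blob):
    return blob  # TLS protects the transport, but the payload stays opaque here

# Back inside the TEE: the plaintext is recovered only in enclave memory.
client_pub, nonce, ciphertext = relay(blob)
prompt = AESGCM(derive_key(enclave_key, client_pub)).decrypt(nonce, ciphertext, None)
assert prompt == b"my private prompt"
```

In a real deployment the client would also verify the enclave's public key against a remote-attestation document before encrypting to it; otherwise the front-end server could substitute its own key and become a silent middle.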