
Comment by satellite2

7 days ago

"All hardware is located in the United States."

If I use local/OSS models, it's specifically to avoid running in a country with no data protection laws. This is a big near-miss here.

I think what matters more here is "All hardware is located outside of China". Being located in the US means little, because that's not good enough for many regulated industries even within the US.

All things considered, though, Europe is getting confusing. They have the GDPR, but are now pushing to backdoor encryption within the EU? [1]

At least there isn't a strong movement in the US trying to outlaw E2E encryption.

[1] https://www.eff.org/deeplinks/2025/06/eus-encryption-roadmap...

Which brings up the point: are truly private LLMs possible? Where the input I provide is only meaningful to me, but the LLM can still transform it without gaining any contextual value from it? Without my sharing a key? And if this can be done, can it be done performantly?
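
This is essentially what homomorphic encryption aims at. A minimal sketch below (my illustration, not anything proposed in the thread), assuming the python-paillier library (pip install phe), shows the easy half: a linear layer computed over ciphertexts the server can never read. The input and weight values are made up.

    # Minimal sketch of "blind" computation with the additively homomorphic
    # Paillier scheme, via the python-paillier library (pip install phe).
    from phe import paillier

    # Client side: generate a keypair and encrypt the input vector.
    public_key, private_key = paillier.generate_paillier_keypair()
    inputs = [0.5, -1.2, 3.0]                            # hypothetical embedding values
    encrypted = [public_key.encrypt(x) for x in inputs]

    # Server side: one linear layer, computed without seeing the plaintext.
    # Paillier supports ciphertext + ciphertext and ciphertext * plaintext
    # scalar, which is exactly enough for a dot product with public weights.
    weights = [0.8, 0.1, -0.4]                           # hypothetical model weights
    encrypted_out = sum(w * e for w, e in zip(weights, encrypted))

    # Client side: only the private-key holder can recover the result.
    print(private_key.decrypt(encrypted_out))            # ~ -0.92

The hard half is everything nonlinear (attention softmax, activations), which needs full FHE schemes such as CKKS or TFHE. Experiments along those lines (e.g. Zama's Concrete ML) currently run orders of magnitude slower than plaintext inference, so "performantly" is the open problem.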

  • Maybe I hit a nerve with the EU part? I thought it was a fair observation, but I'm open to being corrected if there's more nuance I missed.

    • The bill has been stalled since 2022.

      Yes, there is gonna be a new discussion of it on October 15, but I've already seen sections of governments come out against their own government's position on the bill (the Swedish Military, for example).

No, I think the point is to choose the jurisdiction where your cloud-hosted data is best protected from access by very wealthy entities via intelligence-service bribery. That's still, hands down, the USA.

  • Any evidence for the claim that, e.g., Mossad has less penetration into the digital systems of the USA than into those of the RF or PRC?

    • They might have access to any given machine, but they lack the broad scope of general surveillance. If they want to get you, then, just like with most other nation-state-level threats, you will get got. For other threat models, the US works pretty well.

      I guarantee that nobody cares about or will be surveilling your private AI use unless you're doing other things that warrant surveillance.

      The reason big providers suck, as OpenAI is so nicely demonstrating for us, is that they retain everything, the user is the product, and court cases and other situations can unmask and expose everything you do on a platform to third parties. This country seriously needs a digital bill of rights.
