Comment by eqvinox
1 day ago
They maybe (not taking a stance) shouldn't, but I don't think this argument is as simple as it seems. Surveillance of someone's home generally requires a court order beforehand. And depending on the country (I don't believe this applies to the US), words spoken at home also enjoy extended legal protection, i.e. they can't subpoena a friend you had a discussion with.
Now the real question is, do you consider it a conversation or a letter. Any opened¹ letters you have lying around at home can be grabbed with a court-ordered search warrant. But a conversation—you might need the warrant beforehand? It's tricky.
(Again, exact legal situation depends on the country.)
¹ Secrecy of correspondence frequently only applies to letters in sealed envelopes. But then you can get another warrant for the correspondence…
Honest question: why consider the personal home, letters, or spoken words at all, given that most countries around the world already have ample and far more applicable laws and precedent for cloud-hosted private documents?
For the LLM input, that maps 1:1 to documents a person has written and uploaded to cloud storage. And I don't see how generated output could weigh into that at all.
A simple answer to this is: I use local storage or end-to-end encrypted cloud backup for private stuff, and I don't for work stuff. And I make those decisions on a document-by-document basis, since I have the choice of using both technologies.
The question you are asking is: should I approach my daily search tasks with the same degree of thoughtfulness and caution that I do with my document storage choices, and do I have the same options? And the answers I would give are:
* As a consumer I don't want to have to think about this. I want to be able to answer some private questions or have conversations with a trusted confidant without those conversations being logged to my identity.
* As an OpenAI executive, I would also probably not want my users to have to think about this risk, since a lot of the future value in AI assistants is the knowledge that you can trust them like members of your family. If OpenAI can't provide that, something else will.
* As a member of a society, I really do not love the idea that we're using legal standards developed for 1990s email to protect citizens from privacy violations involving technologies that can think and even testify against you.
> [...] should I approach my daily search tasks with the same degree of thoughtfulness and caution that I do with my document storage choices [...]
Then treat them with the same degree of thoughtfulness and caution you have treated web searches on Google, Bing, DuckDuckGo or Kagi for the last decade.
Again, there is no confidant or entity here, no more than there is in the search algorithms we have been using for decades.
> I really do not love the idea that we're using legal standards developed for 1990s email to protect citizens [...]
Fair, but again, that is in no way connected to LLMs. I still see no reason presented why LLM input should be treated any differently to cloud hosted files or web search requests.
You want better privacy? Me too, but that is not in any way connected to or changed by LLMs becoming commonplace. By the same logic, I find any attempt to restrict a specific social media company over privacy and algorithmic concerns laughable if the laws still allow any local competitor to commit the same invasions.