
Comment by nerdjon

8 hours ago

This sounds about as genuine as Google saying anything about privacy.

Both companies are clearly wrong here. There is a small part of me that kind of wants OpenAI to lose this, just so maybe it will be a wake-up call to people putting far too personal information into these services. Am I too hopeful here that people will learn anything?

Fundamentally I agree with what they are saying, though; I just don't find it genuine in the slightest coming from them.

It's clearly propaganda. "Your data belongs to you." I'm sure the ToS says otherwise, as OpenAI likely owns and utilizes this data. Yes, they say they are working on end-to-end encryption (whatever that means when they control one end), but that is just a proposal at this point.

Also, their framing of the NYT's intent makes me strongly distrust anything they say. Sit down with a third-party interviewer who asks challenging questions, and I'll pay attention.

  • "Your data belongs to you" but we can take any of your data we can find and use it for free for ever, without crediting you, notifying you, or giving you any way of having it removed.

    • It's owned by you, but OpenAI has a "perpetual, irrevocable, royalty-free license" to use the data as they see fit.

    • Wow, it's almost like privately managed security is a joke that just turns into de facto surveillance at scale.

  • >your data belongs to you

    …”as does any culpability for poisoning yourself, suicide, and anything else we clearly enabled but don’t want to be blamed for!”

    Edit: Honestly, I'm surprised I left out the bit where they just indiscriminately scraped everything they could find online to train these models. The stones to say "your data belongs to you" when they so clearly feel entitled to our data is unbelievably absurd.

    • >…”as does any culpability for poisoning yourself, suicide, and anything else we clearly enabled but don’t want to be blamed for!”

      Should Walmart be "culpable" for selling rope that someone hanged themselves with? Should Google be "culpable" for returning results about how to commit suicide?


I got one sentence in and thought to myself, "This is about discovery, isn't it?"

And lo, complaints about the plaintiffs started before I even had to scroll. If this company hadn't willy-nilly done everything it could to vacuum up the world's data, wherever it lived and however it was protected, maybe it wouldn't be in this predicament.

Ironically, there is precedent of Google caring more about this. When they realized the location Timeline was a gigantic fed honeypot, they made it per-device and locally stored only. No open letters were written in the process.

Honestly the sooner OpenAI goes bankrupt the better. Just a totally corrupt firm.

  • I really should take the "invest in companies you hate" advice seriously.

    • I don't hate them. It is just plain to see that they have discovered no scalable business model beyond raising larger and larger amounts of capital from investors to utilize intellectual property from others (either directly in the model, as with the NYT, or indirectly via web searches) without any rights. The sooner this fails, the better for all of us.