Comment by jimmySixDOF

1 day ago

Also why the sudden interest? Amazon Alexa snips have been used before in court/investigation and this is not new. But makes me wonder about what happens when you are dealing with summaries of summaries of long gone tokens. Is that evidence?

> But makes me wonder about what happens when you are dealing with summaries of summaries of long gone tokens. Is that evidence?

There is text input and text output; it's really not that complicated.

If used in court, the jury would be given access to the full conversation, just as if it were an email thread.

I suppose it's a good reminder that every cloud service people interact with is collecting data that can be used against them, in court or in any number of other ways, at any point in the future, and that chatbots are no exception.

I'm sure there are many people who thoughtlessly type very personal things into ChatGPT, including things that might not look so good for them if they came out at trial.

> Also why the sudden interest? Amazon Alexa snips have been used before in court/investigation and this is not new.

As I understand it, some people treat ChatGPT like a close personal friend and therapist, confiding their deepest secrets and things like that.

  • Is this any different from people asking Google their deepest, darkest questions?

    • It is very different. In one case you actively have to prod a "neutral" machine to get your dark curiosity satiated, in the other case the machine is designed to draw it out of you.

      Same difference as: "Allowing minors into casinos... is it any different from letting them play cards with their friends at home with their pocket money?"


You have your full history in ChatGPT, not just summaries, and I doubt they permanently delete chats you specifically choose to delete.

  • For ChatGPT, they're under a legal obligation not to delete chats for a period of time.

    https://openai.com/index/response-to-nyt-data-demands/ (yes, that's written 100% from OpenAI's perspective)

    In particular:

    > The New York Times is demanding that we retain even deleted ChatGPT chats and API content that would typically be automatically removed from our systems within 30 days.

    > ...

    > This data is not automatically shared with The New York Times or anyone else. It’s locked under a separate legal hold, meaning it’s securely stored and can only be accessed under strict legal protocols.

    > ...

    > Right now, the court order forces us to retain consumer ChatGPT and API content going forward. That said, we are actively challenging the order, and if we are successful, we’ll resume our standard data retention practices.

  • I think they were referring to the intermediate tokens in "Thinking" models, which are summarized in the interface but ultimately discarded (and may themselves be summaries of sources, other chats, or earlier intermediate states).

    Presumably what's of evidentiary value is the tokens you type, though.