Comment by entropi

7 days ago

I think an even more important question is this: "do we trust Sam Altman (and other people of his ilk) enough to give them the same level of personal knowledge we give to our therapists?"

E.g. if you ever give a hint about not feeling confident with your body, it could easily take this information and nudge you towards certain medical products. Or it could take it one step further and nudge you towards consuming more sugar and certain medical products at the same time, if it sees that this moves the needle even more.

We all know the monetization pressure will come very soon. Do we really advocate for giving this kind of power to these kinds of people?

I feel it's worth remembering that there are reports that Facebook has done almost exactly this in the past. It's not just a theoretical concern:

> (...) the company had crafted a pitch deck for advertisers bragging that it could exploit "moments of psychological vulnerability" in its users by targeting terms like "worthless," "insecure," "stressed," "defeated," "anxious," "stupid," "useless," and "like a failure."

https://futurism.com/facebook-beauty-targeted-ads

Some (most?) therapists use tools to store notes about their patients - some even store the audio/transcripts. They're all using some company's technology already. They're all HIPAA compliant (or whatever the appropriate requirement is).

There's absolutely no reason that LLM providers can't provide equivalent guarantees. Distrusting Sam while trusting the existing providers makes little sense.

BTW, putting mental health aside, many doctors today are using LLM tools to record the whole conversation with the patient, provide good summaries, etc. My doctor loves it - before, he had to listen to me and take notes at the same time; now he feels he can focus on listening to me. He said the LLM does screw up, but he's there to fix those mistakes (and can always listen to the audio to be sure).

I don't know which company is providing the LLM in the backend - likely a common cloud provider (Azure, Google, etc.). But again - they are fully HIPAA compliant. These tools have been in the medical space for well over a year.

  • I don't know how it works in the US, but where I live, patient privacy is actually a thing and they cannot record your sessions or store the recordings and analyze them in arbitrary places without your explicit consent. Even if we accept that this is OK, I would argue there are other problems, like:

    * Copyright laws worked and protected creators' rights - until AI companies decided that it is OK to pirate terabytes of books. One day they might decide that using your therapy records to train/fine-tune their models is also OK. And you surely won't have the same resources as major publishers to fight against it.

    * Even if you trust HIPAA or whatever to be useful and actually protect your rights, current ChatGPT has no such certification (as far as I am aware).

    * When you give your data away, it's no longer in your control. Even if you get extremely lucky, the laws actually work, and the companies actually play along, things may change in, say, ten years, and suddenly your old records may be fair game. It's out of your hands.

    It does not make sense to me.

    • > patient privacy is actually a thing and they cannot record your sessions or store the recordings and analyze them in arbitrary places without your explicit consent.

      So amongst the many papers you sign when you see a therapist/doctor is one where you provide consent.

      > And you surely won't have the same resources as major publishers to fight against it.

      HIPAA violations are taken much, much more seriously than your typical copyright violation.

      > Even if you trust HIPAA or whatever to be useful and actually protect your rights, current ChatGPT has no such certification (as far as I am aware).

      The way to do it is to use GPT via Azure and ensure that Azure's method of providing GPT access is HIPAA compliant. Azure hosts GPT within its own infrastructure and doesn't send your data to OpenAI (or at least my company has such an agreement with MS). You can fill out a form asking Azure not to store any transcripts - even for security/performance monitoring.
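
      To make the Azure route concrete, here is a rough sketch of what it can look like in code - a hypothetical example, not anything the parent comment specifies: the endpoint, key, and deployment name are placeholders, and the code alone doesn't make anything HIPAA compliant (the agreement with Microsoft and the storage opt-out mentioned above are what matter).

      ```python
      # Sketch: pointing the OpenAI SDK at your own Azure OpenAI resource
      # instead of api.openai.com. All names and keys below are placeholders.
      from openai import AzureOpenAI

      client = AzureOpenAI(
          azure_endpoint="https://your-resource.openai.azure.com",  # your Azure resource, not OpenAI's API
          api_key="<key-from-your-azure-resource>",
          api_version="2024-02-01",
      )

      transcript_text = "..."  # a consented (ideally de-identified) visit transcript

      response = client.chat.completions.create(
          model="your-gpt-deployment",  # the deployment name created in your Azure resource
          messages=[
              {"role": "system", "content": "Summarize this visit transcript for the clinician."},
              {"role": "user", "content": transcript_text},
          ],
      )
      print(response.choices[0].message.content)
      ```

      The point is only that requests terminate at your own Azure resource under whatever agreement you've signed with Microsoft; the compliance part is contractual, not something the code itself does.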

      Besides Azure, there are dedicated companies providing LLM access for healthcare that are HIPAA compliant and can provide all the guarantees (even if they use 3rd-party LLMs). It's a simple Google search.

      > When you give your data away, it's no longer in your control. Even if you get extremely lucky, the laws actually work, and the companies actually play along, things may change in, say, ten years, and suddenly your old records may be fair game. It's out of your hands.

      As I pointed out, LLMs are not unique in this respect. What you say is true for any tool therapists use for record keeping.