Comment by BeetleB
6 days ago
> patient privacy is actually a thing and they cannot record your sessions or store the recordings and analyze them at arbitrary places without your explicit consent.
So amongst the many papers you sign when you see a therapist/doctor is one where you provide consent.
> And you surely wont have the same resources as major publishers to fight against it.
HIPAA violations are taken much, much more seriously than your typical copyright violation.
> Even if you trust HIPPA or whatever to be useful and actually protect your rights, current chatgpt has no such certification (as far as I am aware).
The way to do this is to use GPT via Azure and ensure that Azure's method of providing GPT access is HIPAA compliant. Azure hosts the GPT models itself and doesn't send your data to OpenAI (or at least my company has such an agreement with MS). You can also fill out a form asking Azure not to store any transcripts, even for security/performance reasons.
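For what it's worth, here is a minimal sketch of what that looks like in practice, assuming the standard `openai` Python SDK pointed at an Azure OpenAI deployment; the endpoint, key, deployment name, and API version are placeholders, and the HIPAA part comes from the BAA with Microsoft and the data-retention opt-out, not from the code itself:

```python
# Minimal sketch: call a GPT model through an Azure OpenAI deployment
# instead of api.openai.com, so requests stay within your Azure tenant.
from openai import AzureOpenAI  # pip install openai

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",  # placeholder
    api_key="YOUR-AZURE-OPENAI-KEY",                          # placeholder
    api_version="2024-06-01",                                 # example version
)

response = client.chat.completions.create(
    # The Azure *deployment* name you created, not the raw model name.
    model="your-gpt-4o-deployment",
    messages=[{"role": "user", "content": "Summarize today's session notes."}],
)
print(response.choices[0].message.content)
```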
Besides Azure, there are dedicated companies providing LLM access for healthcare that are HIPAA certified and can provide all the guarantees (even if they use third-party LLMs). It's a simple Google search.
> When you give your data, its no longer in your control. Even if you get extremely lucky and laws actually work, and companies actually play along; in, say, ten years things may change and suddenly your old records may be fair game. Its out of your hands.
As I pointed out, this problem is not unique to LLMs. What you say is true for any tool a therapist uses for record keeping.