Comment by BeetleB

7 days ago

> Feeding patient files and clinical notes into a training set violates so many ethical and legal rules;

> How will the "LLM" keep track of 45 minutes worth of notes per session? Do you have any idea how much writing is involved? treatment plans, session notes, treatment team notes, nevermind the other overheads.

It sounds like you're asking this as a hypothetical, when in fact this has been a reality for well over a year (while following all the legal requirements). From another comment of mine:

"BTW, putting mental health aside, many doctors today are using LLM tools to record the whole conversation with the patient and provide good summaries, etc. My doctor loves it - before he was required to listen to me and take notes at the same time. Now he feels he can focus on listening to me. He said the LLM does screw up, but he exists to fix those mistakes (and can always listen to the audio to be sure).

I don't know which company is providing the LLM on the backend - likely a common cloud provider (Azure, Google, etc). But again - they are fully HIPAA compliant. It's been in the medical space for well over a year."

And each session is important to future sessions. All you've shown is that speech-to-text is possible. And the claim is that it will save the audio ... for how long? Forever? What a privacy nightmare.

I've literally done a POC showing that you can do a therapeutic LLM, where the user journals once a day, a few sentences. After a couple of months, the context grows to the point that the LLM starts screwing up when reading the entire history. It hallucinates things that didn't happen, and it starts changing its "feelings" on past events in a way that doesn't make therapeutic sense.

There's no way around this, currently.
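To make the mechanism concrete: a minimal sketch of the naive "feed the whole journal back into the prompt" approach from the POC. All names and numbers here are made-up assumptions for illustration (entry size, context budget, whitespace tokenization), not from any real system, but they show how daily entries blow past a fixed context window within a couple of months - and once the model is reading near or past its usable context, quality degrades.

```python
# Hypothetical sketch: naive context accumulation for a journaling LLM.
# TOKENS_PER_ENTRY and CONTEXT_BUDGET are illustrative assumptions.

TOKENS_PER_ENTRY = 120   # "a few sentences" per day, rough word count
CONTEXT_BUDGET = 8_000   # assumed usable context window, in tokens

def build_prompt(entries):
    """Naively concatenate every past journal entry into one prompt."""
    return "\n\n".join(entries)

entries = []
exceeded_day = None
for day in range(1, 91):  # simulate three months of daily journaling
    # Stand-in for a real entry: a day header plus ~120 filler words.
    entries.append(f"Day {day}: " + "x " * TOKENS_PER_ENTRY)
    prompt = build_prompt(entries)
    approx_tokens = len(prompt.split())  # crude whitespace token count
    if approx_tokens > CONTEXT_BUDGET:
        exceeded_day = day
        break

print(f"Context budget exceeded around day {exceeded_day}")
```

Under these assumptions the budget is blown in roughly two months, and that's before counting system prompts or the model's own responses. Summarizing old entries buys time but introduces its own distortions, which is the "no way around this" part.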

So re-read what I wrote, because I said everything I meant to, and needed to say.