Comment by alterom

14 hours ago

>At the same time, do you really want every conversation you have with your doctor recorded

Yes. This is what medical records are. They've been kept by doctors for a reason.

It's not like the doctor is talking to you about which anime series are the best. You're talking about your health, your body, your disease, your treatment.

It's important to keep track of that.

>Plus what doctor has time to sit down and re-listen to your visit to check to make sure the AI didn't screw up at some point in the future anyway

No doctor.

Which is why it really should be their (or their assistant's) job to record the relevant parts of the conversations.

>At what point does it become a larger waste of time and money to babysit an incompetent AI than just not using one in the first place?

At this point, as the audit shows.

Except the industry (both the AI vendors and healthcare) are going YOLO¹ and relying on AI anyway.

>There are some good uses for AI, but I'm not convinced that this (or many other cases where accuracy matters) is one of them.

This has always been the case, but the marketing has now reached the point of gaslighting, trying to make people collectively forget that or pretend it's not the case.

Once hard evidence is presented (like in this case), the defense is invariably that it's a temporary quality issue that's going to be resolved as the AI improves Any Day Now™, and that it's wise to live as if that were already the case² (and everyone who disagrees is a fool who Will Be Left Behind™).

The level of fervor in this rhetoric gives me the impression that the flaw is so fundamental that it won't be fixed in any form of AI based on today's technologies, that the AI vendor leadership knows this, and that the entire industry is, at this point, a grand pump-and-dump scheme.

I hope I'm wrong.

____

¹ See, you only live once. But there are millions of you. So, like, whatever if you don't. Something something economies of scale to them.

² This is called a phantasm.

> Yes. This is what medical records are. They've been kept by doctors for a reason.

Not every conversation. Historically, one of the nice things about doctors is that they're the ones filtering what gets included in your medical record. They decide what is medically relevant and what can remain confidential. Doctors understand that not everything discussed needs to be included in your file. Sometimes it really is just small talk; sometimes it's even medical concerns, questions, or requests for advice. Still, not all of it needs to go into your file, and much of it would only clutter it up anyway.

Any system that stores an entire visit as audio or video long into the future (much easier/more tempting to do in telehealth settings) is a terrible system. "We may one day need to be able to verify if what AI wrote is real" is a terrible reason to change that.

Doctors (and increasingly patients) understand that a medical record can remain for your entire life. It will probably be seen by many different people within that time for valid reasons, but medical records also get leaked, stolen, sold, or illegally accessed. Patients need to be able to speak freely with their doctors and often depend on their discretion. Knowing that your every word will be recorded and kept, in case somebody 10 years later has a question about what AI wrote in your file, could keep people from being open and honest with their doctors.

> Except the industry (both the AI vendors and healthcare) are going YOLO¹ and relying on AI anyway.

Unless we get strong regulations to prevent it, I'm afraid that you're right, and that this is going to be a problem we experience in a lot of industries and areas besides healthcare. We see it happening in the justice system, for example, and it's already ruining people's lives.

  • >Not every conversation. Historically, one of the nice things about doctors is that they're the ones filtering what gets included in your medical record.

    We're in complete agreement here.

    If we're not talking about an audio/video recording (a thing that nobody needs), the act of producing a record of a conversation involves choosing what goes into it.

    We both agree that not every word that was said needs to go there. Far from it.

    I guess it would be correct to say that there needs to be a record of every medical visit, but nobody needs a recording.

>>At the same time, do you really want every conversation you have with your doctor recorded

>Yes. This is what medical records are.

No. Medical records are limited extracts from conversations, which your doctor, and only your doctor, is qualified to make, using "semantic analysis applied to your unique situation", not "linguistic probabilistic inference applied to a conversation about your situation, using token weights averaged over a billion unrelated samples".

> It's not like the doctor is talking to you about which anime series are the best. You're talking about your health, your body, your disease, your treatment.

No jokes, no banter, no chit-chat, no compliments on the doctor's new Tesla?

> It's important to keep track of that.

Same fallacy Meta fell into when it started tracking employees' keystrokes and mouse gestures. 90% of my mouse movements are just fidgeting, with no relation to the task at hand - and that is not a crime! But if I knew my mouse fidgeting was being watched, I'd make sure that percentage went up to 99% - for the LLM that's gonna be trained off it to self-immolate over its NSFW nature.

  • Hey, I'm in agreement with you.

    I meant that these limited extracts do need to be recorded, that's all.

    Read the rest of the comment :)

    • Oops... I am deeply sorry, thank you for the heads up! It seems I've myself committed a cardinal sin that I am usually quick to point out in others - rushing to reply without comprehending the full message. (Meta-oops: I realized how LLM-ish that sounds. Quick, reboot before my cover is blown!)

      I happen to believe that the flaw being discussed IS fundamental and inherent in the design and architecture of LLMs - this is why I always put "AI" in scare quotes. I've spoken about it in some of my other comments, namely this one: https://news.ycombinator.com/item?id=48046333. And like you, I, too, hope that I am wrong about the hype and its eventual clash with reality, but I'm not holding my breath.