Comment by rconti

1 day ago

It can be helpful, but also untrustworthy.

My mother-in-law has been struggling with some health challenges the past couple of months. My wife (her daughter) works in the medical field and has been a great advocate for her mother. This whole time I've also been peppering ChatGPT with questions, and in turn I discuss matters with my wife based on this.

I think it was generally correct in a lot of its assertions, but as time goes on and the situation doesn't improve, I occasionally revisit my chat and update it with the latest results and findings, and it keeps insisting we're at a turning point and this is exactly what we should expect to be happening.

6 weeks ago, I think its advice was generally spot on, but today it's sounding increasingly tone-deaf and overly optimistic. I'd hate to be _relying_ on this as my only source of advice and information.

Totally agree, it can be a bit of an echo chamber. I had an infection post-dental-work. Bing Chat insisted I had swollen lymph nodes from a cold that would resolve on their own, then decided I had a salivary gland infection. After a follow-up with a real-world ENT, it was (probably accurately) diagnosed as a soft-tissue infection that completely resolved after two rounds of antibiotics. The AI never raised that possibility, whereas the ENT and dentist examined me and reached that conclusion immediately.

I do think AI is great for discussing some health things (like "how should I interpret this report or test result?"), but it's too echo chamber-y and suggestion-prone for accurate diagnosis right now.

  • Ya I wouldn't trust it for diagnosis at this point. But it can help you get pointed in the right direction so humans, tests, and the scientific process can try to figure out the rest.

    Doctors struggle with diagnosis as well. I have stories and I bet everyone has stories about being passed from doctor to doctor to doctor, and none of them talk to each other or work holistically.