Comment by tgtweak

15 hours ago

Much like talking to your doctor - you need to ask/prompt the right questions. I've seen ChatGPT and Gemini make one false assumption that was never mentioned, run with it, and keep referencing it down the line as if it were fact... That can be extremely dangerous if you don't know enough to ask it to reframe, verify, or correct its assumption.

If you are using it as a tool to review/analyze or simplify something - e.g. explain risk stratification for a particular cancer variant and what is taken into account, or ask it to provide probabilities and ranges for survival based on age/medical history - it's usually on the money.

Every other caveat mentioned here is valid, and it's valid for many domains, not just medical.

I did get hematologist/oncologist level advice out of ChatGPT 4o based on labs, PCR tests and symptoms - and it turned out to be 100% true based on how things panned out in the months that followed and ultimately the treatment that was given. Doctors do not like to tell you the good and the bad candidly - it's always "we'll see what the next test says but things look positive" and "it could be as soon as 1 week or as long as several months depending on what we find" when they know full well that unless you're a miracle case, you're in there for 2 months at minimum. Only once cornered or prompted will they give you a larger view of the big picture. The same is true for most professional fields.