Comment by vharish

3 days ago

Even so, it's rather common for doctors to not be able to diagnose correctly. It's a guessing game for them too. I don't know so much about the US, but it's a real problem in large parts of the world. As the comment stated, I would take anything a doctor says with a pinch of salt. Particularly so when the problem is not obvious.

These things are not equivalent.

This is really not that far off from the argument that "well, people make mistakes a lot too, so really, LLMs are just like people, and they're probably conscious too!"

Yes, doctors make mistakes. Yes, some doctors make a lot of mistakes. Yes, some patients get misdiagnosed a bunch (because they have something unusual, or because they are a member of a group—like women, people of color, overweight people, or some combination—that American doctors have a tendency to disbelieve).

None of that means it's a good idea to replace those human doctors with LLMs that occasionally make up brand-new diseases that don't exist.