Comment by gretch

8 hours ago

Every other source of information, including (or maybe especially) human experts, can also make mistakes or hallucinate.

The reason people go to LLMs for medical advice is because real doctors actually fuck up each and every day.

For clear, objective examples, look up stories where surgeons leave things inside patients' bodies post-op.

Here’s one, and there are many like it.

https://abc13.com/amp/post/hospital-fined-after-surgeon-leav...

"A few extreme examples of bad fuck ups justify totally disregarding the medical profession."

  • "Doing your own research" is back on the menu boys!

  • I'll insist the surgeon follows ChatGPT's plan for my operation next time I'm in theatre.

      By the end of the year AI will actually be doing the surgery, when you look at the recent advancements in robotic hands, right bros?