Comment by gretch
11 days ago
Every other source of information, including (or maybe especially) human experts, can also make mistakes or hallucinate.
The reason people go to LLMs for medical advice is that real doctors actually fuck up each and every day.
For clear, objective examples, look up stories where surgeons leave things inside patients' bodies post-op.
Here's one, and there are many like it.
https://abc13.com/amp/post/hospital-fined-after-surgeon-leav...
[flagged]
Please don't use quotation marks to make it look like you're quoting someone when you aren't. That's an internet snark trope, and we're trying to avoid that kind of thing here.
You're welcome to make your substantive points thoughtfully, of course.
https://news.ycombinator.com/newsguidelines.html
Yup, make up something I didn't say to take my argument to a logical extreme so you can feel smug.
"totally disregard"
Yeah, right, that's what I said.
"Doing your own research" is back on the menu boys!
I'll insist the surgeon follows ChatGPT's plan for my operation next time I'm in theatre.
By the end of the year, AI will actually be doing the surgery, when you look at the recent advancements in robotic hands, right, bros?