Comment by AuthAuth
9 hours ago
There are certain domains where we can accept a high degree of mistakes, and health is not one of them. People are reacting the way they are because it's obvious that LLMs are currently not reliable enough to be trusted to distribute health advice. To me it doesn't matter that ChatGPT health sometimes gives good advice or that some people _feel_ like it helped them. I'm not sure I even trust people when they say that, given how much the LLM just affirms whatever they tell it.
I'm not saying LLMs don't make mistakes, I'm saying that they sometimes get it right, so you shouldn't be surprised to hear some positive stories.
But then you don't even believe these are success stories, so I guess you can never be convinced.
You'll hear some positive stories from people who think rocks heal them.