Comment by azan_
1 day ago
> Dystopian and frankly, gross. It's amazing to me that so many people are willing to give up control over their lives and in this case, their bodies, for the smallest inkling of ease.
I've read accounts from people with chronic conditions reporting that ChatGPT actually helped them land a correct diagnosis that doctors did not consider, so people are not just using it for an "inkling of ease".
Yes, trading your privacy and autonomy for perceived ease is how they are going to steal your future and your freedom.
Please read my comment again. If you lived with chronic pain that multiple doctors failed to correctly diagnose, and ChatGPT actually suggested the correct diagnosis, then you wouldn't call it just perceived ease, but something that made your life much, much better. I'm a doctor and I'm all for empowering patients (as long as they discuss ChatGPT output with actual doctors). It's very easy to criticize people resorting to LLMs if you do not have a rare debilitating condition that's not correctly diagnosed.
With all due respect, you are thinking like a good person, a human being who spent decades of their life learning how to care for people. You took a pledge to Do-no-Harm. You are looking at these tools as tools.
The owners and future owners of said data do not care about anything other than profits and exploitation. They do not care about the patient or the doctor, let alone the consequences of their doings. They took a pledge to make-profits regardless of the harm. A position fundamentally opposed to that of the medical doctor.
5 replies →
What they seem to be saying is "this is how they get you," which I agree with. Whether or not it's immensely helpful is not being debated. There's a very serious cost no matter what.
Genuinely curious, what happens to me if the wrong people know about my chronic back pain and GERD?