Comment by DudeOpotomus

1 day ago

Yes, trading your privacy and autonomy for perceived ease is how they are going to steal your future and your freedom.

Please read my comment again. If you lived with chronic pain that multiple doctors failed to correctly diagnose, and ChatGPT actually suggested the correct diagnosis, you wouldn’t call it just perceived ease, but something that made your life much, much better. I’m a doctor, and I’m all for empowering patients (as long as they discuss ChatGPT output with actual doctors). It’s very easy to criticize people resorting to LLMs if you do not have a rare debilitating condition that hasn’t been correctly diagnosed.

  • With all due respect, you are thinking like a good person, a human being who spent decades of their life learning how to care for people. You took a pledge to do no harm. You are looking at these tools as tools.

    The owners and future owners of said data do not care about anything other than profit and exploitation. They do not care about the patient or the doctor, let alone the consequences of their actions. They took a pledge to make profits regardless of the harm — a position fundamentally opposed to that of the medical doctor.

  • What they seem to be saying is “this is how they get you,” which I agree with. Whether or not it’s immensely helpful is not what’s being debated. There’s a very serious cost either way.

Genuinely curious: what happens to me if the wrong people find out about my chronic back pain and GERD?