
Comment by DudeOpotomus

1 day ago

Dystopian and frankly, gross. It's amazing to me that so many people are willing to give up control over their lives and, in this case, their bodies, for the smallest inkling of ease.

The only thing you have control of in this world is your body (men only; women have already been denied bodily autonomy in the US), so giving this to the very entities that "do harm," as opposed to those who pledge to "do no harm," is straight up bonkers.

It's not the data or the use of said data for the intended purpose. There is a law of sorts in life that says whatever they promise will be broken. The data and its intended purpose will be perverted and ultimately used as a weapon against the very people who provided it.

Blah blah blah.

I have a more niche genetic issue, and I'm glad for you that you can think like this, but no one cares enough to do the proper research for my problem.

If ML, massive compute, or Google/ChatGPT health do something in this direction (let's be honest, anything), I'm glad for it.

  • You will be denied coverage and treatment because you volunteered your personal data with zero controls over its use and your rights.

    • Denied by who? Most developed nations will not, including the US. Either way, that’s a separate problem solved by legislation or increased wealth transfers. Denying people information doesn’t help.

      1 reply →

  • Do you believe that ChatGPT is doing the research? I'm all in favor of better access and tools to research, but at least in the US all of the research is being defunded, we're actively kicking researchers out of the country, and a bunch of white billionaires are proposing this as an alternative, based on training data they won't share.

    This is a product feature that invalidates WebMD and the like. It does not solve any health problems.

LLMs still provide value. They are much quicker than seeing a doctor, and with Deep Research for ChatGPT (and whatever Google is calling the Gemini search equivalent now) you can actually see the sources behind the information it is looking at.

Parsing 100 different scientific articles, or even Google search results, is not going to happen before I get bored and move on. This is the value of LLMs.

Even if the LLM data is used in training or sold off, one way to protect oneself is to add knowingly incorrect data to the chat. You know it is incorrect; the LLM will believe it. Then the narrative is substantially changed.

Or wait about six months and the open-source Chinese models [Kimi/Qwen/Friends] will have caught up to Claude and Gemini, IMO. Then just run these models quantized locally on Apple Silicon or a GPU.

> Dystopian and frankly, gross. It's amazing to me that so many people are willing to give up control over their lives and, in this case, their bodies, for the smallest inkling of ease.

I've read people with chronic conditions reporting that ChatGPT actually helped them land a correct diagnosis that doctors did not consider, so people are not just using it for an "inkling of ease".

  • Yes, trading your privacy and autonomy for perceived ease is how they are going to steal your future and your freedom.

    • Please read my comment again. If you lived with chronic pain that multiple doctors failed to correctly diagnose, and ChatGPT actually suggested the correct diagnosis, then you wouldn't call it just perceived ease, but something that made your life much, much better. I'm a doctor and I'm all for empowering patients (as long as they discuss ChatGPT's output with actual doctors). It's very easy to criticize people resorting to LLMs if you do not have a rare debilitating condition that's not correctly diagnosed.

      7 replies →

    • Genuinely curious, what happens to me if the wrong people know about my chronic back pain and GERD?

How is someone seeking a way to deal with an inherited or environmentally caused illness giving up control of their body?

  • You will be assigned an individualized risk figure that will determine whether or not you are given coverage and treatment. Those decisions will happen without your or any MD's involvement. You will never know it happened, and it will follow you for the rest of your life and your children's lives.

    • If they are willing to exert this level of indiscretion with privately sold data, I don't see why they wouldn't just use black-market PHA if the former weren't available.

  • Don’t forget that the majority of the commenters on this platform live in a country that views suffering in pain from an incurable disease as "the way God intended" (and a horse dose of morphine). Take it with a grain of salt.

    • What specific country are you talking about? I've had people close to me suffer health problems for years waiting for treatment because they worshipped the government healthcare system and government doctors so much that they refused to seek any help outside it.

      Problem is, small and solvable health problems become incurable if you don't fix them in time.

    • Setting aside that you're factually incorrect, this sort of negative stereotyping of others based on their nationality (or ethnicity, or race) is inappropriate, especially on this forum. We don't need more bigotry here.

      2 replies →

> Dystopian and frankly, gross. It's amazing to me that so many people are willing to give up control over their lives and, in this case, their bodies, for the smallest inkling of ease.

You have to be extremely privileged to say something like this.

a) nobody is giving up control of their lives

b) get off your high horse, son