Comment by jdross

1 day ago

Spend 15 minutes talking to a person in their 20's about how they use ChatGPT to work through issues in their personal lives and you'll see how much they already trust the "advice" and other information produced by LLMs.

Manipulation is a genuine concern!

Netflix needs to do a Black Mirror episode where either a sentient AI pretends it's "dumber" than it is while secretly plotting to overthrow humanity, or an LLM is hacked by deep-state actors to provide similarly manipulated advice.

It's not just young people. My boss (originally a programmer) agreed with me that there are lots of problems using ChatGPT for our products and programs, as it gives wrong answers too often, but then 30 seconds later told me that it was apparently great at giving medical advice.

...later someone higher up decided that it's actually great at programming as well, and so now we all believe it's incredibly useful and necessary for us to do our daily work.

  • Most doctors will prescribe antibiotics for viral infections just to get you out and the next guy in; they have zero interest in sitting there troubleshooting with you.

    For this reason o3 is way better than most of the doctors I've had access to, to the point where my PCP just writes whatever I bring in because she can't follow 3/4 of it.

    Yes, the answers are often wrong and incomplete, and it's up to you to guide the model to sort it out, but it's just like vibe coding: if you put in the steering effort, you can get a decent output.

    Would it be better if you could hire an actual professional to do it? Of course. But most of us are priced out of that level of care.