Comment by saaaaaam

3 months ago

I’ve seen various older people that I’m connected with on Facebook posting screenshots of chats they’ve had with ChatGPT.

It’s quite bizarre from that small sample how many of them take pride in “baiting” or “bantering” with ChatGPT and then post screenshots showing how they “got one over” on the AI. I guess there’s maybe some explanation - feeling alienated by technology, not understanding it, and so needing to “prove” something. But it’s very strange and makes me feel quite uncomfortable.

Partly because of the “normal” and quite naturalistic way they talk to ChatGPT but also because some of these conversations clearly go on for hours.

So I think normies maybe do want a more conversational ChatGPT.

> So I think normies maybe do want a more conversational ChatGPT.

The backlash from GPT-5 proved that. The normies want a very different LLM from what you or I might want, and unfortunately OpenAI seems to be moving toward a more direct-to-consumer focus and catering to that.

But I'm really concerned. People don't understand this technology at all. The way they talk to it, the suicide stories, etc. point to people in general not grokking that it has no real understanding or intelligence, and the AI companies aren't doing enough to educate (because why would they? They want you to believe it's superintelligence).

These overly conversational chatbots will cause real-world harm to real people. They should reinforce, over and over again to the user, that they are not human, not intelligent, and do not reason or understand.

As with a lot of these things, it's not really the technology itself that's the problem; it's a people and education problem. That's something regulators are supposed to solve, but ours aren't: we have an administration that is very anti-AI-regulation, all in the name of "we must beat China."

  • I just cannot imagine myself sitting just “chatting away” with an AI. It makes me feel quite sick to even contemplate it.

    Another person I was talking to recently kept referring to ChatGPT as “she”. “She told me X”, “and I said to her…”

    Very very odd, and very worrying. As you say, a big education problem.

    The interesting thing is that a lot of these people are folk who are on the edges of digital literacy - people who maybe first used computers when they were in their thirties or forties - or who never really used computers in the workplace, but who now have smartphones - who are now in their sixties.

    • As a counterpoint, I've been using my own PC since I was 6 and know reasonably well about the innards of LLMs and agentic AI, and absolutely love this ability to hold a conversation with an AI.

      Earlier today, procrastinating from work, I spent an hour and a half talking with it about the philosophy of religion and had a great time, learning a ton. Sometimes I do just want a quick response to get things done, but I find living in a world where I'm able to just dive into a deep conversation with a machine that has read the entirety of the internet is incredible.

      4 replies →

    • I'm the same, though I'm only 30.

      Why would I want to invest emotionally in a literal program? It's bizarre, especially when you consider that the way you talk to it shapes the responses.

      They are essentially talking to themselves and love themselves for it. I can't understand it and I use AI for coding almost daily in one way or another.

      2 replies →

    • While your comment represents a common view, also here on HN, I find it bizarre: Hacker News is in part about innovative new technologies, and the new behaviours that emerge around them. For what it's worth, in the last 5 years LLMs have been extremely successful tech that has shaped society, maybe on the scale of the iPhone when it came out. Yet this comment is like the "I can't believe everyone is staring at their phone in the subway instead of talking" trope, or "this couple is on a date but they're just on their phones." On Hacker News I would expect people to be more open to such new behaviours as they emerge, instead of kind of kink-shaming them. I myself talk for hours to ChatGPT, and am astounded by this new tech. I certainly find it better than TikTok (which, after trying it out, I don't allow myself to use).

    • In the future, this majority who love the artificial pampering will vastly out-vote and out-influence us.

      I hope it won’t suck as bad as I predict it will for actual individuals.

This reminds me of a short sci-fi story I read. The world was controlled by an AI, but some people wanted to rebel against it. In the end, one of them managed to infiltrate the AI and destroy it. But the AI had known this was what the rebel wanted, so it created the whole scenario to let him feel superior. The AI was in no danger; it was far too intelligent to be taken down by one person, but it gave him exactly what he wanted. Control the humans by giving them a false sense of control.

Personally, I want a punching bag. It's not because I'm some kind of sociopath or need to work off some aggression. It's just that I need to work the upper body muscles in a punching manner. Sometimes the leg muscles need to move, and sometimes it's the upper body muscles.

ChatGPT is the best social punching bag. I don't want to attack people on social media. I don't want to watch drama, play violent games, or anything like that. I think the punching bag is a good analogy.

My family members do it all the time with AI. "That's not how you pronounce protein!" "YOUR BALD. BALD. BALDY BALL HEAD."

Like a punching bag, sometimes you need to adjust the response. You wouldn't punch a wall. Does it deflect, does it mirror, is it sycophantic? The conversational updates are new toys to play with.