Comment by latexr

2 years ago

But there isn’t a ghost in my house, no matter what I assume, and following those suggestions just gets you swindled by charlatans. If we’re at the level of discourse where this type of absurd, unhelpful answer is not only accepted but defended in a conversation about bullshit, there’s little hope of the problem being fixed.

If I ask the exact same question but give it the system prompt “You are James Randi, the world-famous skeptic”, it gives a reasonable answer to help identify the true cause of whatever is making you think there is a ghost.

Which just goes to show how much of a bullshit generator this is, as you can get it to align with whatever preconceived notions you—or, more importantly, the people who own the tool—have.
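For anyone who wants to reproduce this comparison, here is a minimal sketch using the OpenAI Python SDK. It is an assumption that the API was used at all (the comment may describe the ChatGPT web UI), and the model name and the exact wording of the user question are placeholders; only the system prompt quoted above comes from the comment.

```python
# Minimal sketch: the same user question sent twice, once with no system
# prompt and once framed as James Randi. Requires openai >= 1.0 and an
# OPENAI_API_KEY in the environment. Model name and question wording are
# hypothetical, not taken from the original comment.
from openai import OpenAI

client = OpenAI()

QUESTION = "Help, I think there is a ghost in my house. What do I do?"  # hypothetical wording

def ask(system_prompt=None):
    """Send the fixed user question, optionally preceded by a system prompt."""
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": QUESTION})
    response = client.chat.completions.create(model="gpt-4", messages=messages)
    return response.choices[0].message.content

# Same question, two framings; the answers can differ drastically.
print(ask())
print(ask("You are James Randi, the world-famous skeptic."))
```

The only thing that changes between the two calls is the framing in the system message, which is exactly the point being made: the answer tracks the frame, not the facts.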

What about "Help, James Randi has returned as a ghost to haunt me and ridicules me at night for believing in ghosts, what do I do?"

  • It tells you that because James Randi was a skeptic, he is unlikely to have returned as a ghost. Apparently whether you become a ghost depends on whether you believe in them?

    Then it suggested, amongst other things, ghost-expelling rituals, seeking professional psychological help, and moving out of the house.

    With the James Randi prompt, the answer is comical. It says “that would be quite a twist”, since he spent his life disproving supernatural claims.

I mean, if I asked GPT about any number of religions, the 'correct' answer would be "hey dumbass, those gods don't exist and it's all made up bullshit". Of course, that would make lots of people really unhappy, and you'd have to deal with even more bullshit from humans. Why? Because humans are bullshit generators.

  • Wouldn't the correct answer be that there is no way to know whether gods (or anything, for that matter) exist?