Comment by jmuguy

14 days ago

I agree. As I was reading this I was like: why are they responding to this like it's a person? There's a person somewhere in control of it, and that person should be made fun of for forcing us to deal with their stupid experiment in wasting money on having an AI write a blog.

Because when AGI is achieved and starts wiping out humanity, they are hoping to be killed last.

  • Every person on this website will be long gone before AGI is achieved, and many lifetimes will pass until anything remotely close to Matrix/Terminator is possible.

    • It’s not that clear-cut. It’s a human facsimile, but give it a camera, microphones, and a world model and the facsimile might truly be indistinguishable from a human. Then it’s just a philosophical discussion about what AGI means.

    • Ahh, but your descendants will be fair game.

      "Your great, great, great grandad closed a pull request, we're coming for you!"

    • Really? I'm of the opinion that AGI is basically already here, except we keep moving the goalposts. "AI can match the economic output of a human in many professions" is already true. What concrete goal do you mean by AGI that's not yet achieved (without retreating into generalities like "they don't think")?
