Comment by cpill
2 years ago
I guess they will just unplug it? The fact that they need large amounts of electricity, which is not trivial to generate, makes them very vulnerable; power is usually the first thing to go in a war. Not to mention there is no machine that self-replicates. Full humanoid robots are going to have an immense support burden, the same way cars do, with complex supply chains. I guess this is the reason nature didn't evolve robots.
This neglects both basic extrapolation and basic introspection.
That is exactly what I'm accusing you of. The burden of proof is on you, so by all means, extrapolate all the way to our extinction.
"Just unplug it" works only if you realize that the AGI is working against your interests. If its at least human level intelligent it's going to realize that you will try doing that and it will only actually make it clear it wants to kill you when there's nothing you can do about it.
OK, so outline how we get to that situation.
You're asking how an above human intelligence might attempt a takeover? Or are you asking why it might attempt to do so?