Comment by tcgv
2 years ago
Ten years from now will either be:
a) Remember all that fuss about AI destroying the world? Lol.
~ or ~
b) I'm so glad those people stepped in to save us from doom!
Which one do you think is more likely?
Unless AI becomes 1,000,000x more energy-efficient, my money is on a).
The amount of energy required for AI to be dangerous to its creators is so vast that I can't see how it could realistically happen.
That depends on how it's used. See the Terminator movies. One false positive is enough to end the world with even current AI tech if it's merely mated to a nuclear arsenal (even a small one might trigger a global escalation). There have been false positives before, and the only reason they didn't end in nuclear Armageddon was that the actual operators hesitated and defied standard protocol, which otherwise would probably have led to the end of the world as we know it.
We know that we can run human level intelligence with relative efficiency.
Without getting into timelines, it seems obvious that human energy usage should be an upper bound on the energy required for intelligence, since humans demonstrate it is achievable.
If we manage to harness the ego energy emanating from some people working on "AI", we should be halfway there!
It will never be B even if the “safetyists” are correct.
We rarely notice the near-catastrophic misses, except in obvious cases where we accidentally drop a nuke or something.
I'll bite... "a"
Or: c)
Fair enough!
c) Humanity unleashed AI superintelligence, but safeguards proved inadequate, leading to our extinction
There, now all the alternatives are on the table. So, doesn't that show that your original question was meaningless? Because if there is any non-zero probability that the correct answer is c), should anyone really fuck around with this shit, regardless of how small that probability is?
1 reply →