Comment by kypro
1 day ago
Controversially, I'd argue there is likely an optimal and stable level of technological advancement which we would be wise not to cross. That said, we are human, so we will; I'd just rather it happened in a couple hundred years rather than a decade or two.
For example, it's hard to imagine an AI that gives us the capability to cure cancer but doesn't also give us the capability to create targeted super-viruses.
Nick Bostrom's Vulnerable World Hypothesis more or less describes my own concerns: https://nickbostrom.com/papers/vulnerable.pdf
At some point we should probably try to resist the urge to keep picking balls out of the urn, as we may eventually pull out one we can't put back.
Also controversially, it isn't clear to me that perfect totalitarianism (what Bostrom calls solutions 3 and 4, ubiquitous surveillance and global governance) is a preferable outcome to devastation.