
Comment by GuB-42

2 years ago

We already have limited "artificial superintelligences". A pocket calculator is better at calculating than the best humans, and we certainly put calculators to good use. What we call AIs are just more general-purpose versions of tools like pocket calculators or guns.

And that's the key: it is a tool, a tool that will give a lot of power to whoever controls it. That's where safety matters. It should be built so that it helps good guys more than it helps bad guys, and so that it limits accidents. How? I don't know. Maybe the people at SSI do. We already know that the Three Laws of Robotics won't work; Asimov only made them up to write stories about how broken they are :)

Current-gen AIs are already cause for concern. They have proven to be good at bullshitting, something bad actors are already taking advantage of. I don't believe in a robot apocalypse, technological singularities, etc... but some degree of control, like we have with weapons, is not a bad thing. We are not there yet with AI, but we might be soon.