Comment by supafastcoder
2 years ago
imagine the hubris and arrogance of trying to control a “superintelligence” when you can’t even control human intelligence
No more so than trying to control a supersonic aircraft when we can't even control pigeons.
I know nothing about physics. If I came across some magic algorithm that occasionally poops out a plane that works 90 percent of the time, would you book a flight in it?
Sure, we can improve our understanding of how NNs work, but that isn't enough. How are humans supposed to fully understand and control something that is, by definition, smarter than they are? I think it's inevitable that at some point that smart thing will behave in ways humans don't expect.
> I know nothing about physics. If I came across some magic algorithm that occasionally poops out a plane that works 90 percent of the time, would you book a flight in it?
With this metaphor you seem to be saying we should, if possible, learn how to control AI? Preferably before anyone's life is endangered by it? :)
> I think it's inevitable that at some point that smart thing will behave in ways humans don't expect.
Naturally.
The goal, at least for those most worried about this, is to make that surprise not be a… oh, I've just realised a good quote:
""" the kind of problem "most civilizations would encounter just once, and which they tended to encounter rather in the same way a sentence encountered a full stop." """ - https://en.wikipedia.org/wiki/Excession#Outside_Context_Prob...
Not that.
Correct, pigeons are much more complicated and unpredictable than supersonic aircraft, and the way they fly is far more complex.
I can shoot down a pigeon that’s overhead pretty easily, but not so with an overhead supersonic jet.
If that's your standard of "control", then we can definitely "control" human intelligence.