Comment by stoniejohnson

2 years ago

Given your initial assumptions, that self-moderating end state makes sense.

I feel like we still have a disconnect in our definitions of a superintelligence.

From my perspective, this thing is insanely smart. We can hold ~4 things in our working memory (maybe Von Neumann could hold like 6-8); I'm thinking this thing can hold on the order of millions of things in its working memory for tasks requiring fluid intelligence.

With that sort of gap, I feel like at minimum the ASI would be able to trick the cleverest human into doing anything, but more realistically, humans might appear entirely closed-form to it, where getting a human to do anything is more of a mechanistic exercise than a social game.

Like, the reason my earlier example was concrete pillars with weird wires is that, with an intelligence gap that big, the ASI will be quickly doing things that don't make sense to us while exercising strong command over the world around it.