
Comment by bamboozled

2 years ago

My point was that we don't have superintelligent AGI yet, so there is little to suggest it will just be software.

Even the state-of-the-art systems we have today need to run on some pretty significant hardware to be performant, right?

A "state of the art" system would almost by definition be running on special and expensive hardware. But I have llama3 running on my laptop, and it would have been considered state of the art less than 2 years ago.

A related point to consider is that a superintelligence would presumably be a better coder than we are, so the risk isn't only from it directly "copying" itself, but also from it "spawning" and spreading other, more optimized (in terms of resource utilization) software that would advance its goals.

  • I guess it’s hard to even imagine how a superintelligence would be coding. Who knows what would be going on. It really is sci-fi.