
Comment by alecco

2 years ago

We are far from a conscious entity with willpower and self-preservation. This is just like a calculator, but a calculator that can do things that will seem like miracles to us humans.

I worry about dangerous humans with the power of gods, not about artificial gods. Yet.

> Conscious entity... willpower

I don't know what those terms mean. Why should they matter?

> Self-preservation

This is no more than fine-tuning for the task, even with current models.

> I worry about dangerous humans with the power of gods, not...

There's no law of the universe that says you can only worry about one thing at a time. So worrying about risk 'A' does not in any way let us dismiss risks 'B' through 'Z'.

  • Because people talking about AGI and superintelligence are most likely thinking of something like Skynet.

    • Why worry about the opinion of people who are confused?

      Without using the words 'conscious', 'sentient', 'AGI', or 'intelligence', what do you think about the future capabilities of LM AI and their implications for us humans?

> conscious entity with willpower and self-preservation

There’s no good reason to suspect that consciousness implies an instinct for self-preservation. There are plenty of organisms with an instinct for self-preservation that have little or no conscious awareness.

That’s the attitude that’s going to leave us with our pants down when AI starts doing really scary shit.