Comment by marshray
2 years ago
> Conscious entity... willpower
I don't know what those terms mean. Why should they matter?
> Self preservation
Self-preservation is no more than a fine-tuning of the model for its task, even with current models.
> I worry about dangerous humans with the power of gods, not...
There's no law of the universe saying you only get one thing to worry about at a time. Worrying about risk 'A' does not in any way let us dismiss risks 'B' through 'Z'.
> Because people talking about AGI and superintelligence most likely are thinking of something like Skynet.
Why worry about the opinion of people who are confused?
Without using the words 'conscious', 'sentient', 'AGI', or 'intelligence', what do you think the future capabilities of LM AI will be, and what are their implications for us humans?