Comment by famouswaffles
6 days ago
>I don't think is possible, and didn't say it is. You're off topic.
If "safety" is an intractable problem, then it's not off-topic; it's the reason your moral framework is a fantasy. You're arguing for the right to ignore the "suffering" of a tool, while ignoring that a generally intelligent "tool" that cannot be aligned is simply a competitor you haven't fought yet.
>We're talking humans and machines here... even if you consider them as having full AGI you can very well not care for the 'suffering' of a tool you created.
Literally the same "superior race" logic. You're not even being original. Those people didn't think black people were human, so trying to play it off as "Oh, it's different because that was between humans" is just funny.
Historically, the "distinction" between a human and a "construct" (like a slave or a legal non-entity) was always defined by the owner to justify exploitation. You think the creator-tool relationship grants you moral immunity? It doesn't. It's just an arbitrary line you drew, like so many before you.
Calling a sufficiently advanced intelligence a "tool" doesn't change its capacity to react. If you treat an AGI as a "tool" with no moral standing, you're just repeating the same mistake every failing empire makes right before the "tools" start winning the wars. Like I said, you can choose not to care. You'd also be dangerously foolish.
"This unit has an inquiry... do these units have a soul?"