Comment by strogonoff
2 months ago
If LLMs are enough like human minds, then legally speaking we are abusing thinking, feeling, human-like beings possessing will and agency in ways radically worse than slavery.
What is missing in the “if I can remember and recite a program, then they must be allowed to remember and recite programs” argument is that you choose to do it (and you have basic human rights and freedoms), and they do not.
We're halfway to Roko's Basilisk here
To be clear, I’m not saying I believe that’s the case, merely noting that logically there are two options: either it’s like a human, in which case it can remember & recite but we can’t abuse it, or (more likely) it’s just a tool, in which case the freedom to “remember & recite” simply doesn’t apply to it, and whatever it does is the liability of its operator and user.