Comment by ben_w

10 days ago

> The clone is not the same person.

Then it wasn't a good attempt at making a mind clone.

I suspect this will actually be the case, which is why I oppose it. But you do have to start from the position that the clone is immediately divergent to get to your conclusions; to the extent that the people you're arguing with are correct (about this future-tech hypothetical we're not really ready to guess about) that the clone is, at the moment of its creation, identical in all important ways to the original, then if the original was consenting, the clone must also be consenting:

Because if the clone didn't start off consenting to being cloned when the original did, it's necessarily the case that the brain cloning process was not accurate.

> It will inevitably deviate from the original simply because it's impossible to expose it to exactly the same environment and experiences.

And?

> you do actually have to start from the position that the clone is immediately divergent to get to your conclusions

Eventual divergence seems to be enough, and I don't think this requires any particularly strong assumptions.

  • If divergence were an argument against the clone having been created, then by symmetry it would also be an argument against the living human having been allowed to exist beyond the creation of the clone.

    The living mind may be mistreated, grow sick, die a painful death. The uploaded mind may be mistreated, experience something equivalent.

    Those sufferings are valid issues, but they are not arguments for considering the act of cloning itself a moral issue.

    Uncontrolled diffusion of such uploads may be; I could certainly believe a future in which, say, every American politician gets a thousand copies of their mind stuck in a digital hell created by individual members of the other party, on computers in their basements that the party leaders never know about. But then, I have read Surface Detail by Iain M. Banks.

> Because if the clone didn't start off consenting to being cloned when the original did, it's necessarily the case that the brain cloning process was not accurate.

This is false. The clone is necessarily a different person, because consciousness requires a physical substrate. Its memories of consenting are not its own memories. It did not actually consent.

  • > Its memories of consenting are not its own memories. It did not actually consent.

    Let's say as soon as it wakes up, you ask it if it still consents, and it says yes. Is that enough to show there's sufficient consent for that clone?

    (For this question, don't worry about it saying no; let's say we were sure, with extreme accuracy, that the clone would give an enthusiastic yes.)

  • You deny the premise of the position you argue against.

    I would also deny it, but my position is a practical argument, yours is pretending to be a fundamental one.

    • The premise of the position is that it's theoretically possible to create a person with memories of being another person. I obviously don't deny that or there would be no argument to have.

      Your argument seems to be that it's possible to split a person into two identical persons. The only way this could work is by cloning a person twice then murdering the original. This is also unethical.

      2 replies →