Comment by Tenoke

5 years ago

I don't know, but my bigger issue is that, before the scan, this means 99% of the future subjective experience I can expect to have will be spent working without remembering any of it. I'm not into that, given that a much smaller fraction of my subjective experience will be spent reaping the gains.

I wonder a lot about the subjective experience of chance around copying. Say it's true that if you copy yourself 99 times, then you have a 99% chance of finding yourself as one of the copies. What if you copy yourself 99 times, you run all the copies deterministically so they don't diverge, then you pick 98 copies to merge back into yourself (assuming you're also a software agent or we just have enough control to integrate a software copy's memories back into your original meat brain): do you have a 1% chance of finding yourself as that last copy and a 99% chance of finding yourself as the merged original? Could you do this to make it arbitrarily unlikely that you'll experience being that last copy, and then make a million duplicates of that copy to do tasks with almost none of your original subjective measure? ... This has to be nonsense. I feel like I must be very confused about the concept of subjective experience for this elaborate copying charade to sound useful.
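To make the bookkeeping concrete, here's a toy sketch in Python of the naive rule the paragraph above implicitly assumes: subjective measure splits equally across instances on copying and sums back together on merging. The rule itself and all the names are assumptions for illustration, not a claim about how subjective experience actually works.

```python
from dataclasses import dataclass

@dataclass
class Instance:
    label: str
    measure: float

def copy(inst, n_copies):
    # Naive rule (assumption): the instance's measure splits equally
    # across the original plus its n_copies copies.
    share = inst.measure / (n_copies + 1)
    out = [Instance(inst.label, share)]
    out += [Instance(f"{inst.label}/copy{i}", share) for i in range(n_copies)]
    return out

def merge(instances, label):
    # Naive rule (assumption): merging pools the measures back together.
    return Instance(label, sum(i.measure for i in instances))

you = Instance("you", 1.0)

# Copy yourself 99 times: 100 instances, each with measure 0.01,
# i.e. a 99% chance of "finding yourself" as one of the copies.
instances = copy(you, 99)

# Merge the original and 98 copies back; one copy is left out.
merged = merge(instances[:99], "merged-you")
leftover = instances[99]
print(round(merged.measure, 2), round(leftover.measure, 2))  # 0.99 0.01

# Duplicate the leftover copy a million times to do the work.
workers = copy(leftover, 999_999)
print(round(sum(w.measure for w in workers), 2))  # 0.01 total, spread over 1e6 workers
```

Under this rule total measure is conserved, which is exactly why the charade looks like it "works": the million workers collectively carry only 1% of the original measure. Whether copying and merging actually behave this way is the very thing being questioned.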

And then it gets worse: in certain variations of this logic, you could buy a lottery ticket and perform certain copying setups based on the result to increase your subjective experience of winning the lottery. See https://www.lesswrong.com/posts/y7jZ9BLEeuNTzgAE5/the-anthro.... I wonder whether I should take that as an obvious contradiction, or whether the universe works in an alien enough way for it to be valid.

  • Not sure I fully understand you. This is of course all hypothetical, but if you make one copy of yourself, there isn't a 50% chance that you "find yourself as the copy", unless the copying mechanism was somehow designed for this.

    You'll continue as-is; there's just another you there, and he will initially think he's the source, since it was the source's mind-state that was copied. Fortunately, the copying machine color-coded the source's headband red and the copy's headband blue, which clears up the confusion for the copy.

    At this point the two of you will obviously start to diverge, and you must be considered two different sentient beings, neither of which can ethically be terminated. It's just as ethically wrong to terminate the copy as the source at this point: you are identical in matter, but two lights are on, twice the capacity for emotion.

    This also means that mind-uploading (moving) from one medium (meat) to another (silicon?) needs to be designed as a continuous journey as experienced from the source's perspective if it is to become commercially viable (or bet on people not thinking about this hard enough, since the surviving copy wouldn't mind). Otherwise it's just a COPY A TO B, DELETE A experience for the source, which would be like death.

    • Imagine being someone in this experiment. You wake up still 100% sure that you won't be a copy, just as you were before going to sleep. Then you find out you are the copy. It would seem to me that reasoning which led you to believe you definitely won't be a copy, while you indeed find yourself to be one, must be faulty.

Interesting that you object, because I'm pretty certain it was you who was eager to run software on rat brains. What's so different about this? In both cases, from my point of view, a sentient being is robbed of its existence.

  • Have I? I don't remember the context, but here I'm specifically talking about what I'd expect to experience if I were in this situation.

    I do value myself and my experience more than a rat's, and if presented with the choice between the torture of a hundred rats or me, I'll choose for them to be tortured. If we get to trillions of rats, I might very well choose to be tortured instead, as I do value their experience, just significantly less.

    I also wouldn't be happy if everything ran off rat brains experiencing displeasure, but I'd be fine with sacrificing some number of rats for technological progress that improves more people's lives in the long run. I imagine whatever I've said on the topic before is consistent with the above.

Of course, that's already the case, unless you believe that this technology will never be created and used, or that your own brain's relevant contents can and will be made unusable.

Is it “your” experience though? Those never make their way back to the original brain.

  • From the point of view of me going to sleep before the simulation procedure, with one simulation I am just as likely to wake up inside it as outside of it; I should be equally prepared for either scenario. With thousands of uploads (N copies giving an N/(N+1) chance), I should expect it to be far more likely that the next thing I experience is waking up simulated.

    • The real you is beyond that timeline already. None of those simulations is “you”, so comparing the simulation runtimes to actual life experience (the 99% you mentioned) makes little sense.
