Comment by AgentME
5 years ago
Is it murder/suicide when you get blackout drunk and lose a few hours of memory? Imagine it comes with no risk of brain damage and choosing to do it somehow lets you achieve your pursuits more effectively. Is it different if you do it a thousand times in a row? Is it different if the thousand times all happen concurrently, either through copies or time travel?
Death is bad because it stops your memories and values from continuing to have an impact on the world, and because it deprives other people who have invested in interacting with you of your presence. Shutting down a thousand short-lived copies on a self-contained server doesn't have those consequences. At least, that's what I believe for myself, but I'd only be deciding for myself.
> Is it murder/suicide when you get blackout drunk and lose a few hours of memory?
No, but that's not what's happening in this thought experiment. In this thought experiment, the lives of independent people are being ended. The two important arguments here are that they're independent (I'd argue that for their creative output to be useful, or for the simulation to be considered accurate, they must be independent from each other and from the original biological human) and that they are people (that argument might face more resistance, but in precisely the same way that arguments about the equality of biological humans have historically faced resistance).
Imagine instead that at the end of a task, instead of deleting a copy, it and the original are merged again, such that the merged self is made up of both equally and has both their memories. (This is easier to imagine if both are software agents, or they're both biological, and the new merged body is made up of half of the materials of each.) In this case, I think it's apparent that the copy should have no fear of death and would be as willing as the original to work together.
Now imagine that because there are too many copies, there are too many unique memories, so before the merger the copy has its memory wound back to how it was at the scan, not too different from if the copy had gotten blackout drunk.
Now because the original already has those memories, there's no real difference between the original and the merged result. Is there any point in actually doing the merge then instead of dropping the copy? I'm convinced that actually bothering with that final merge step is just superstitious fluff.
> I'm convinced that actually bothering with that final merge step is just superstitious fluff.
Sure, but that's an easy thing to be convinced of when you know you're not a copy with an upcoming expiration date!
Have you read Greg Egan? I believe there is a book by him with this very same concept.
I think the difference is that when I start drinking with the intention or possibility of blacking out, I know that I'll wake up and there will be some continuity of consciousness.
When I wake up in a simworld and am asked to finally refactor my side project so it can connect to a postgres database, not only do I know that it will be the last thing that this one local instantiation experiences, but also that this local instantiation will get no benefit out of it!
If I get blackout drunk with my friends in meatspace, we might have some fun stories to share in the morning, and our bond will be stronger. If I push some code as a copy, there's no benefit for me at all. In fact, there's not much stopping me from promising my creator that I'll get it done and then spending the rest of my subjective existence trying to instantiate some beer and masturbating.
I really enjoyed the exploration of this premise in the novel "Kil'n People" by David Brin.
https://en.wikipedia.org/wiki/Kiln_People
The premise is quite similar to "uploads", except the device is a "golem scanner", which copies your mind into a temporary, disposable body. Different "grades" of body can be purpose-made for different kinds of tasks (thinking, menial labour, etc.).
The part that resonates with your comment is around the motivation of golems, who are independently conscious and have their own goals.
In the novel, some people can't make useful golems because their copies of themselves don't do what they want. There's an interesting analogy with self-control, which is about doing things that suck now to benefit your future self. This is similar, but your other self exists concurrently!
Key to the plot, though, is the "merge" step: you can take the head of an expiring golem, scan it, and merge its experiences with your own. This provides some continuity and meaning to anchor the golem's life.
It seems like you may not see the local instantiation and the original as sharing the same identity. If I were a local instantiation that knew the length of my existence was limited (and that an original me would live on), that doesn't mean I'd act differently from my original self in rebellion. I'd see myself and the original as the same person, whose goals and future prospects of reward are intertwined.
Like another commenter pointed out, I'd see my experience as a memory that would be lost, outside of the manifestation of my work. It would be nice to have my memories live on in my original being, but it's not required.
This concept of duplicated existence is also explored in the early 2000s children's show Chaotic (although the memories of one's virtual self do get merged with the original in the show): https://en.wikipedia.org/wiki/Chaotic_(TV_series)
There are plenty of situations where people do things for benefits that they personally won't see. Like people who decide to avoid messing up the environment even though the consequences might not happen in their lifetime or to themselves specifically. Or scientists who work to add knowledge that might only be properly appreciated or used by future generations. "A society grows great when old men plant trees whose shade they know they shall never sit in". The setup would just be the dynamic of society recreated in miniature with a society of yourselves.
If you psych yourself into the right mood, knowing that the only remaining thing of consequence to do with your time is your task might be exciting. I imagine there's some inkling of truth in https://www.smbc-comics.com/comic/dream. You could also make it so all of your upload-selves have their mental states modified to be more focused.
If such a technology existed, it would definitely require intense mental training and preparation before it could be used. One would have to become the most detached Buddhist in order to be the sort of person who, when cloned, did not flip their shit over discovering that the rest of their short time alive will go only to furthering the master branch of their own life.
It would change everything about your personality, even as the original and surviving copy.
I don't know, but my bigger issue would be that, before the scan, this means 99% of the future subjective experience I can expect to have will be spent working without remembering any of it, which I'm not into, given that a much smaller fraction of my subjective experience will be spent reaping the gains.
I wonder a lot about the subjective experience of chance around copying. Say it's true that if you copy yourself 99 times, then you have a 99% chance of finding yourself as one of the copies. What if you copy yourself 99 times, you run all the copies deterministically so they don't diverge, then you pick 98 copies to merge back into yourself (assuming you're also a software agent or we just have enough control to integrate a software copy's memories back into your original meat brain): do you have a 1% chance of finding yourself as that last copy and a 99% chance of finding yourself as the merged original? Could you do this to make it arbitrarily unlikely that you'll experience being that last copy, and then make a million duplicates of that copy to do tasks with almost none of your original subjective measure? ... This has to be nonsense. I feel like I must be very confused about the concept of subjective experience for this elaborate copying charade to sound useful.
And then it gets worse: under certain variations of this logic, you could buy a lottery ticket and do certain copying setups based on the result to increase your subjective experience of winning the lottery. See https://www.lesswrong.com/posts/y7jZ9BLEeuNTzgAE5/the-anthro.... I wonder whether I should take that as an obvious contradiction, or if maybe the universe works in an alien enough way for that to be valid.
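To make the accounting I'm gesturing at concrete, here's a rough sketch of the naive "count every surviving instance equally" model that reasoning assumes (Python purely for illustration; subjective_odds is a made-up helper, and none of this claims the model is actually right):

    # Toy model: treat every instance that wakes up after the procedure as
    # equally likely to be "you", and get probabilities by counting instances.
    def subjective_odds(instances):
        """Chance of finding yourself as each kind of instance, assuming
        every surviving instance carries equal weight."""
        total = sum(instances.values())
        return {kind: count / total for kind, count in instances.items()}

    # 99 copies made and everyone keeps running:
    print(subjective_odds({"original": 1, "copies": 99}))
    # -> {'original': 0.01, 'copies': 0.99}

    # 99 deterministic copies made, 98 merged back into the original, one
    # copy left running -- *if* merging really transfers the copies' share:
    print(subjective_odds({"merged original": 99, "last copy": 1}))
    # -> {'merged original': 0.99, 'last copy': 0.01}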
Not sure I fully understand you. This is of course all hypothetical, but if you make 1 copy of yourself, there's not a 50% chance that you "find yourself as the copy", unless the copying mechanism was somehow designed for this.
You'll continue as is; there's just another you there, and he will initially think he's the source, since it was the source mind-state that was copied. Fortunately, the copying machine color-coded the source headband red and the copy headband blue, which clears up the confusion for the copy.
At this point you will obviously start to diverge, and you must be considered two different sentient beings that cannot ethically be terminated. It's just as ethically wrong to terminate the copy as the source at this point: you are identical in matter, but two lights are on, twice the capability for emotion.
This also means that mind uploading (moving) from one medium (meat) to another (silicon?) needs to be designed as a continuous journey as experienced from the source's perception if it is to become commercially viable, rather than a COPY A TO B, DELETE A experience for the source, which would be like death. (Or it could bet on people not thinking about this hard enough, since the surviving copy wouldn't mind.)
Interesting that you object, because I'm pretty certain it was you who was eager to run software on rat brains. What's so different about this? In both cases a sentient being is robbed of its existence, from my point of view.
Have I? I don't remember the context, but here I'm specifically talking about what I'd expect to experience if I were in this situation.
I do value myself and my experience more than a rat's, and if presented with the choice between the torture of a hundred rats or of me, I'll choose for them to be tortured. If we go to trillions of rats, I might very well choose for myself to be tortured instead, as I do value their experience, just significantly less.
I also wouldn't be happy if everything were running off rats' brains that are experiencing displeasure, but I'd be fine with sacrificing some number of rats for technological progress that will improve more people's lives in the long run. I imagine whatever I've said on the topic before is consistent with the above.
Of course, that's already the case, unless you believe that this technology will never be created and used, or that your own brain's relevant contents can and will be made unusable.
Is it “your” experience, though? Those experiences never make their way back to the original brain.
From the point of view of me going to sleep before the simulation procedure, with 1 simulation I am just as likely to wake up inside it as outside of it, so I should be equally prepared for either scenario. With thousands of uploads, I should expect a much higher chance that the next thing I experience is waking up simulated.
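(If every instance is weighted equally, with N uploads that chance works out to N/(N+1): 1/2 for a single simulation, just over 99.9% for a thousand.)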
https://en.wikipedia.org/wiki/White_Christmas_(Black_Mirror)