Comment by pavel_lishin
5 years ago
Who among us hasn't dreamed of committing mass murder/suicide on an industrial scale to push some commits to Github?
Is it murder/suicide when you get blackout drunk and lose a few hours of memory? Imagine it comes with no risk of brain damage and choosing to do it somehow lets you achieve your pursuits more effectively. Is it different if you do it a thousand times in a row? Is it different if the thousand times all happen concurrently, either through copies or time travel?
Death is bad because it stops your memories and values from continuing to have an impact on the world, and because it deprives other people who have invested in interacting with you of your presence. Shutting down a thousand short-lived copies on a self-contained server doesn't have those consequences. At least, that's what I believe for myself, but I'd only be deciding for myself.
> Is it murder/suicide when you get blackout drunk and lose a few hours of memory?
No, but that's not what's happening in this thought experiment. In this thought experiment, the lives of independent people are being ended. The two important arguments here are that they're independent (I'd argue that for their creative output to be useful, or for the simulation to be considered accurate, they must be independent from each other and from the original biological human) and that they are people (that argument might face more resistance, but in precisely the same way that arguments about the equality of biological humans have historically faced resistance).
Imagine instead that at the end of a task, instead of deleting a copy, it and the original are merged again, such that the merged self is made up of both equally and has both their memories. (This is easier to imagine if both are software agents, or they're both biological, and the new merged body is made up of half of the materials of each.) In this case, I think it's apparent that the copy should have no fear of death and would be as willing as the original to work together.
Now imagine that because there are too many copies, there are too many unique memories, and before the merger, the copy has its memory wound back to how it was at the scan, not too different from if the copy got blackout drunk.
Now because the original already has those memories, there's no real difference between the original and the merged result. Is there any point in actually doing the merge then instead of dropping the copy? I'm convinced that actually bothering with that final merge step is just superstitious fluff.
2 replies →
I think the difference is that when I start drinking with the intention or possibility of blacking out, I know that I'll wake up and there will be some continuity of consciousness.
When I wake up in a simworld and am asked to finally refactor my side project so it can connect to a postgres database, not only do I know that it will be the last thing that this one local instantiation experiences, but that the local instantiation will also get no benefit out of it!
If I get blackout drunk with my friends in meatspace, we might have some fun stories to share in the morning, and our bond will be stronger. If I push some code as a copy, there's no benefit for me at all. In fact, there's nothing stopping me from promising my creator that I'll get it done and then spending the rest of my subjective experience trying to instantiate some beer and masturbating.
I really enjoyed the exploration of this premise in the novel "Kiln People" by David Brin.
https://en.wikipedia.org/wiki/Kiln_People
The premise is quite similar to "uploads" except the device is a "golem scanner", which copies your mind into a temporary, disposable body. Different "grades" of body can be purpose made for different kinds of tasks (thinking, menial labour etc).
The part that resonates with your comment is around the motivation of golems, who are independently conscious and have their own goals.
In the novel, some people can't make useful golems, because their copies of themselves don't do what they want. There's an interesting analogy with self control; that is about doing things that suck now, to benefit your future self. This is similar, but your other self exists concurrently!
Key to the plot, though, is the "merge" step: you can take the head of an expiring golem, scan it, and merge its experiences with your own. This provides some continuity and meaning to anchor the golem's life.
It seems like you may not see the local instantiation and the original as sharing the same identity. If I were a local instantiation that knew the length of my existence was limited (and that the original me would live on), that wouldn't mean I'd act differently than my original self in rebellion. I'd see myself and the original as the same person, whose goals and future prospects of reward are intertwined.
Like another commenter pointed out, I'd see my experience as a memory that would be lost outside the manifestation of my work. It would be nice to have my memories live on in my original being, but it's not required.
This concept of duplicated existence is also explored in the early 2000s children's show Chaotic (although the memories of one's virtual self do get merged with the original in the show): https://en.wikipedia.org/wiki/Chaotic_(TV_series)
There are plenty of situations where people do things for benefits that they personally won't see. Like people who decide to avoid messing up the environment even though the consequences might not happen in their lifetime or to themselves specifically. Or scientists who work to add knowledge that might only be properly appreciated or used by future generations. "A society grows great when old men plant trees whose shade they know they shall never sit in". The setup would just be the dynamic of society recreated in miniature with a society of yourselves.
If you psyche yourself into the right mood, knowing that the only remaining thing of consequence to do with your time is your task might be exciting. I imagine there's some inkling of truth in https://www.smbc-comics.com/comic/dream. You could also make it so all of your upload-selves have their mental states modified to be more focused.
2 replies →
I don't know, but my bigger issue is that, before the scan, this means 99% of the future subjective experience I can expect to have will be spent working without remembering any of it, which I am not into, given that a much smaller fraction of my subjective experience will be spent reaping the gains.
I wonder a lot about the subjective experience of chance around copying. Say it's true that if you copy yourself 99 times, then you have a 99% chance of finding yourself as one of the copies. What if you copy yourself 99 times, you run all the copies deterministically so they don't diverge, then you pick 98 copies to merge back into yourself (assuming you're also a software agent or we just have enough control to integrate a software copy's memories back into your original meat brain): do you have a 1% chance of finding yourself as that last copy and a 99% chance of finding yourself as the merged original? Could you do this to make it arbitrarily unlikely that you'll experience being that last copy, and then make a million duplicates of that copy to do tasks with almost none of your original subjective measure? ... This has to be nonsense. I feel like I must be very confused about the concept of subjective experience for this elaborate copying charade to sound useful.
And then it gets worse: in certain variations of this logic, you could buy a lottery ticket and do certain copying setups based on the result to increase your subjective experience of winning the lottery. See https://www.lesswrong.com/posts/y7jZ9BLEeuNTzgAE5/the-anthro.... I wonder whether I should take that as an obvious contradiction, or if maybe the universe works in an alien enough way for that to be valid.
2 replies →
Interesting that you object, because I'm pretty certain it was you who was eager to use rat brains to run software on them. What's so different about this? In both cases, from my point of view, a sentient being is robbed of their existence.
1 reply →
Of course, that's already the case, unless you believe that this technology will never be created and used, or that your own brain's relevant contents can and will be made unusable.
Is it “your” experience though? Those never make their way back to the original brain.
6 replies →
https://en.wikipedia.org/wiki/White_Christmas_(Black_Mirror)