DeVliegendeHollander comments on Open Thread, Jul. 13 - Jul. 19, 2015 - Less Wrong Discussion
Comments (297)
Cryonics is an ambulance ride through an earthquake zone to the nearest revival facility. The distance is measured in years rather than miles, and the earthquake is the chances of history. The better the preservation, the lower the technology required to revive you, and the sooner you will reach a facility that can do it.
A "powerful enough" AI isn't magic: it cannot recover information that no longer exists. We currently don't know what must be preserved and what is redundant, beyond just "keep the brain, the rest of the body can probably be discarded, but we'll freeze it as well at extra cost if you want."
On a present-day level, the feted accomplishments of Deep Learning suggest to me that setting such algorithms to munch over a person's highly documented life might be enough to enable a more or less plausible simulation of them after death. Plausible enough at least to be offered as a comfort to the bereaved. A market opportunity! Also, fuel for a debate on whether these simulations are people.
Can you recommend an article on the difference between a simulation of a person and "really" reviving a person? Primarily from this angle: why should I, or anyone, consider someone in the future making a plausible simulation of us to be good for "us"? I am really confused about the identity of a person, i.e. when a simulation is really "me" in the sense of me having a self-interest in that situation. I am heavily influenced by Buddhist ideas saying such an identity does not exist, that it is illusory. I currently think the closest thing to it is memories: if I exist at all, I exist as something that remembers what happened to this illusion-me. I see this as a difficult philosophical problem and don't know how to relate to it.
Same here. My own attitude is that we do not currently have software for which the question of its being any more conscious than a rock even arises, nor any route to making such software. Therefore I am not going to worry about it. While the question may be interesting for philosophers, I relate to the problem by ignoring it, or by engaging with it no further than as an idle recreation.
I view it from a practical viewpoint: even if you accept the Buddhist view that the self is an illusion, you still feel like you have a self for >95% of the time (i.e. whenever you're not meditating). When you wake up in the morning you feel like you are the same person who went to sleep the evening before. On the other hand, a clone of you would not feel like it is you, any more than one identical twin feels it is the other. So ideally, people in the future should create a person/simulation that feels like it went to sleep and woke up again when it "should" have died.
Problems arise mainly when you hit something that only partially feels like it is the same person. I'd say there is still a considerable range of possible people who are similar enough that we would call them the same person, since there is also considerable variation in the normal functioning of human brains.
I wonder whether it is possible to find some sort of "core" personality/traits/memories, such that we can say that as long as it remains unchanged, it is the same person. I suspect there isn't one, as it seems to be a gradient rather than a binary classification.