I have been trying to absorb the LessWrong near-consensus on cryonics/quantum mechanics/uploading, and I confess to being unpersuaded by it. I'm not hostile to cryonics, just indifferent, and I'm having a bit of trouble articulating why the insights on identity I have been picking up from the quantum mechanics sequence aren't compelling to me. I offer the following thought experiment in the hope that others may be able to present the argument more effectively once they understand the objection.
Suppose that Omega appears before you and says, “All life on Earth is going to be destroyed tomorrow by [insert cataclysmic event of your choice here]. I offer you the chance to push this button, which will upload your consciousness to a safe place out of reach of the cataclysmic event, preserving all of your memories, etc. up to the moment you pushed the button and optimizing you such that you will be effectively immortal. However, the uploading process is painful, and because it interferes with your normal perception of time, your original mind/body will subjectively experience the time after you pushed the button but before the process is complete as a thousand years of the most intense agony. Additionally, I can tell you that a sufficient number of other people will choose to push the button that your uploaded existence will not be lonely.”
Do you push the button?
My understanding of the LessWrong consensus on this issue is that my uploaded consciousness is me, not just a copy of me. I'm hoping the above hypothetical illustrates why I'm having trouble accepting that.
Hm.
I can imagine myself agreeing to be tortured in exchange for someone I love being allowed to go free. I expect that, if that offer were accepted, shortly thereafter I would agree to let my loved one be tortured in my stead if that will only make the pain stop. I expect that, if that request were granted, I would regret that choice and might in fact even agree to be tortured again.
It would not surprise me to discover that I could toggle between those states several times until I eventually had a nervous breakdown.
It's really unclear to me how I'm supposed to account for these future selves' expressed preferences, in that case.
If the tortured-you would make the same decision all over again, my intuition (I think) agrees with yours. My objection is basically to splitting off "selves" and subjecting them to things that the post-split self would never consent to.