Guaranteed death places a limit on the value of my life to myself.
It puts a limit on the value of other lives, too.
Whatever a life is worth, so long as the same factor limits the potential worth of all lives, the dilemma between altruism and selfishness remains the same.
Your overall point seems to be: "If some people live a really, really long time, and others don't, we won't value the lives of the 'mortals' as much as we do those of the 'immortals.'"
But don't we value saving nine-year-olds more than ninety-year-olds? The real question is, "If I'm immortal, why aren't they?"
You also miss the obvious positive effects of valuing life more greatly. War would be virtually impossible between immortal nations, at least insofar as it requires public support and soldiers. It would also be (to some degree) morally defensible for immortal nations to value their citizens' lives more highly than the lives of mortal nations' citizens, which means they would be more willing to use extreme force, and so mortal nations would be much more hesitant to provoke them. Our expenditures on safety and disaster preparedness would also probably increase dramatically, and our risk-taking would decrease just as dramatically.
In other words, I'm not sure this post clearly communicates your point, and, to the extent it does, your point seems underdeveloped and quite probably bad.
What does she say that convinces you?
"The entity that gave you instruction did not provide you adequate evidence in support of its claims! The odds that it's just messing with you are more orders of magnitude than you can count more likely than the truth of its statement."
What does she say that convinces you?
She doesn't have to say anything - she would have had to push herself well out of the norm and into the range of "people whose richly deserved death would improve the world" before I would even consider it.
I would just say "Omega, you're a bastard", and continue living normally.
What does she say that convinces you?
Imagine Omega came to you and said, "Cryonics will work; it will be possible for you to be resurrected and have the choice between a simulation and a new healthy body, and I can guarantee you live for at least 100,000 years after that. However, for reasons I won't divulge, your surviving to experience this is wholly contingent upon you killing the next three people you see."
This offer could have positive expected value in terms of number of lives if, for example, you were a doctor who expected to save more than three lives during the next 100,000 years.
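To make that arithmetic concrete, here is a minimal sketch of the raw-lives count; the saving rate is a made-up assumption for illustration, since the comment only fixes the three victims and the 100,000 years:

```python
# Back-of-the-envelope count of Omega's deal in raw lives.
# lives_saved_per_year is a pure assumption for illustration.

lives_taken = 3                # the three people Omega demands
years_granted = 100_000        # guaranteed lifespan after revival
lives_saved_per_year = 0.001   # hypothetical: one life saved per millennium

net_lives = years_granted * lives_saved_per_year - lives_taken
print(f"net change in lives: {net_lives:+g}")  # +97 with these numbers
```

On this crude count the deal comes out positive whenever the rate exceeds 3/100,000 lives per year; everything the count leaves out is what the rest of the thread argues about.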
I just feel like saying this:
FOOLISH MORTAL!
Sorry. (I don't mean anyone here, I just had to say it.)
Cryonics is good because life is good. The subjective value of my life doesn't make it ok to kill someone I perceive as less valuable.
Here's another argument against: if murder suddenly becomes a defensible position in support of cryonics, how do you think society, and therefore societal institutions, will respond once murder becomes the norm? I think it becomes less likely that cryonic institutions will succeed, thus jeopardizing everyone's chances of living 100,000+ years.
People's willingness to sacrifice their own lives might change drastically, agreed.
But there are counteracting factors.
People will think far more long-term and save more. They might even put more thought into planning. The extra saving might result in an extra safety widget that saves more lives. You can't really disregard that.
They will be more polite and more honest, because life's too long and the world's too small; people will think ten times before cheating anyone. The extra business that generates and the prosperity that brings might save more lives th...
There are other similar dilemmas like: why do you go to the cinema if that money could be spent saving one of 16,000 children who die from hunger every day?
My answer is: we are all selfish beings, but whereas in our primitive environment (as cavemen) the disparities wouldn't be that great, for lack of technology, nowadays those who have access to the latest technology can leverage much more advantage for themselves. But unfortunately, if you have to make the decision between cryonics for yourself vs. saving N children from starvation: If you still want to be alive in 10...
I would be interested to hear from those who are actually signed up for cryonics. In what ways, if any, have you changed your willingness to undertake risks?
For example, when flying, do you research the safety records of the airlines that you might travel with, and always choose the best? Do you ride a motorbike? Would you go skydiving or mountaineering? Do you cross the road more carefully since discovering that you might live a very long time? Do you research diet, exercise, and other health matters? Do you always have at the back of your mind the thought...
My actual reaction in the scenario you describe would be to say "Piss off" before I turned around.
But cryonics is a wash as far as taking risks goes. First, nobody is sure it will work, only that it gives you better odds than burial or cremation. Second, suppose you were sure it would work: becoming a fireman looks like a better deal -- die in the line of duty and be immortal anyway. Granted, something might happen to destroy your brain instantly, but there's no reason to believe that's more likely than the scenario where you live to be old and your brain disintegrates cell by cell while the doctors prolong your agony as long as they can.
Cryonics aside: first, we should talk in probabilities, not certainties, and this is true of pretty much everything, including god, heliocentrism, etc. Second, cryonics may have a small chance of succeeding - say, 1% (a number pulled out of thin air) - but that's still enormously better than the alternative: a 0% chance of being revived after dying in any other way.
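As a toy version of that comparison (the 1% is the comment's own thin-air figure, and the payoff is borrowed from the Omega scenario rather than any real estimate):

```python
# Expected years of post-death life: cryonics vs. any alternative.
p_revival = 0.01            # the comment's "number pulled out of thin air"
years_if_revived = 100_000  # hypothetical payoff from the Omega scenario

ev_cryonics = p_revival * years_if_revived  # 1000.0 expected years
ev_otherwise = 0.0                          # 0% revival chance, by assumption
print(ev_cryonics, ev_otherwise)
```

Any nonzero probability strictly beats a guaranteed zero, however small the number you plug in.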
Did these two sentences' adjacency stick out to anybody else?
Guaranteed death places a limit on the value of my life to myself. Parents shield children with their bodies; Casey Jones happens more often. People run into burning buildings more often. (Suicide bombers happen more often, too, I realize.)
I'm not sure I interpret this the same way you do.
My understanding is that parents are willing to risk their lives for their children mostly because that's how we've been programmed by evolution by natural selection, not because we consciously or unconsciously feel that our death is putting a limit on the value of our lives to ourselves.
Does anyone really expect that this population would not respond to its incentives by avoiding more danger? Anecdotes aside: do you expect them to join the military with the same frequency, to be firemen with the same frequency, or to be doctors administering vaccinations in jungles with the same frequency?
Agreed--indeed, I suspect that one of the first steps to fundamentally altering the priorities of society may be the invention of methods to materially prolong life, such that it really does become an unspeakable tragedy to lose somebody permanently.
Humans risk their lives for less noble causes as well, extreme sports and experimental aircraft being some examples. I have a romantic streak in me that says that, yes, death is worse than life, but worrying overly about death also devalues life.
Should I pore over actuarial statistics and only select activities that do not increase my chance of death?
An alternate scenario: Omega forms an army and conscripts three people into it, and orders them to kill you. Omega then hands you a knife, with which you can certainly dispatch the unarmed, untrained conscripts who obediently follow their commander's wishes (despite vague apprehensions about war and violence and lack of a specific understanding for why they are to kill you).
Unfortunately, Omega is a very compelling commander, and no surrender or retreat is possible. It's kill or be killed.
What do you do?
The debate already exists, for altruists who care about future generations as well: would you kill three people to stop an asteroid/global warming/monster of the week from killing more in future?
This is just the same question, made slightly more immediate and selfish by including yourself in that future.
ETA: To the extent that your post is asking about personal behaviour, you perhaps should have made that point clear. You appear to be making a general point about morality, and your "kill three people" hypothetical appears to distract from your actual point, and is probably a large part of why you're getting downvoted, as it's rather antagonistic. I'll keep the rest of my comment intact, as I believe it to be generally relevant.
This would be more constructive were it not self-centered, i.e. if the question were, "I'll grant so-and-so 100,000 years...
You turn around and see someone. She says, "Wait! You shouldn't kill me because ... "
UTILITARIAN
She says, "Wait! You shouldn't kill me because I'm signed up for cryonics too! This means that the total utility change will be negative if you kill me and the other people!"
VIRTUE ETHICS
"Wait! You shouldn't kill me because selfishly murdering others for personal gain is not a characteristic of a virtuous man!"
DEONTOLOGICAL ETHICS
"Wati! You shouldn't kill me because it's against the rules! Against the Categorical Imperativ...
If I kill the next three people, are they cryogenically preserved? Or is the next sentence implying an upper bound to the value of their life as opposed to contrasting with what would happen should you kill them?
I can also tell you that the next three people you see, should you fail to kill them, will die childless and will never sign up for cryonics. There is a knife on the ground behind you."
So, if you fail to kill them, they wind up childless and without cryonics. Does this mean that if you do kill them, they will get cryonics and children?
Imagine Omega came to you and said, "Cryonics will work; it will be possible for you to be resurrected and have the choice between a simulation and a new healthy body, and I can guarantee you live for at least 100,000 years after that. However, for reasons I won't divulge, your surviving to experience this is wholly contingent upon you killing the next three people you see. I can also tell you that the next three people you see, should you fail to kill them, will die childless and will never sign up for cryonics. There is a knife on the ground behind you."
You turn around and see someone. She says, "Wait! You shouldn't kill me because ... "
What does she say that convinces you?
That's a quote from a comment in a post about cryonics. "I can't be more specific" is not doing this comment any favors, and overall the comment was rebutted pretty well. But I did try to imagine these other valuable parts, and I realized something that remains unresolved for me.
Guaranteed death places a limit on the value of my life to myself. Parents shield children with their bodies; Casey Jones happens more often. People run into burning buildings more often. (Suicide bombers happen more often, too, I realize.)
I think this is a valuable part of humanity, and I think that an extreme "life is good, death is bad" view does do violence to it. You can argue we should effect a world that makes this willingness unnecessary, and I'll support that; but separate from making the willingness useless, eliminating that willingness does violence to our humanity. You can argue that our humanity is overrated and there's something better over the horizon, i.e. the cost is worth it.
But the incentives for saving 1+X lives at the cost of your own just got weaker. How do you put a price on heaven? orthonormal suggests that we should rely on human irrationality here to keep us moral: that, thankfully, we are too stupid and slow to actually change the decisions we make after recognizing that the expected value of our options has changed, despite the opportunity cost of those decisions growing considerably. I think this a) underestimates humans' ability to react to incentives and b) underestimates the reward the universe bestows on those who do react to incentives.
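To put rough, entirely hypothetical numbers on how much the personal price of heroism rises:

```python
# Illustrative shift in the expected cost of dying heroically.
# Every figure here is an assumption, not a claim from the post.

years_forfeited_mortal = 50   # expected remaining years a mortal gives up
p_revival = 0.01              # hypothetical probability that cryonics works
years_if_revived = 100_000    # lifespan assumed in the Omega scenario

years_forfeited_cryonicist = years_forfeited_mortal + p_revival * years_if_revived
ratio = years_forfeited_cryonicist / years_forfeited_mortal
print(f"a heroic death costs ~{ratio:.0f}x more expected years")  # ~21x
```

Even with a modest revival probability, the expected cost of self-sacrifice jumps by an order of magnitude, which is exactly the kind of incentive shift that point a) says people react to.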
I don't see a good "solution" to this problem, other than to rely on cognitive dissonance to make this seem less offensive in the future than it does now. The people for whom this presents a problem will eventually die out anyway, as there is a clear advantage to favoring it. I guess that's the (ultimately anticlimactic) takeaway: morals change in the face of progress.
So, which do you favor more - your life, or your identity?
EDIT: Well, it looks like this is getting fast-tracked for disappeared status. I think it's interesting that people seem to think I'm making a statement about a moral code. I'm not; I'm talking about incentives and what would happen, not what the right thing to do is.
Let's say Eliezer gets his wish and many, many parents sign up for cryonics and sign their children up as well. Does anyone really expect that this population would not respond to its incentives by avoiding more danger? Anecdotes aside: do you expect them to join the military with the same frequency, to be firemen with the same frequency, or to be doctors administering vaccinations in jungles with the same frequency? I don't think it's possible to say that with a straight face and mean it; populations respond to incentives, and the incentives just changed for that population.