orthonormal comments on That Magical Click - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
What the hell, I'll play devil's advocate.
Right now, we're all going to die eventually, so we can make tradeoffs between life and the other values we consider essential. But take away that hard stop and the value of your own life suddenly skyrockets: given that you could almost certainly, eventually, erase any negative feelings about actions taken today, it becomes hard to justify not doing horrible things to save your own life if you were forced to.
Imagine Omega came to you and said, "Cryonics will work; you will be resurrected and have the choice between a fleshbody and simulation, and I can guarantee you live for 10,000 years after that. However, for reasons I won't divulge, this is contingent upon you killing the next 3 people you see."
Well, shit. Let the death calculus begin.
You make a valid theoretical point, but as a matter of contingent fact, the only consequence I see is that people signed up will strongly avoid risks of having their brains splattered. Less motorcycle riding, less joining the army, etc.
Making people more risk-averse might indeed give them pause at throwing themselves in front of cars to save a kid, but:

- Snap judgments are made on instinct, at a level that doesn't respond to such factors; you wouldn't be any less likely to react that way even if you consciously knew beforehand that the kid had leukemia and wouldn't be cryopreserved.
- In this day and age, risking your life for someone or something else with conscious premeditation does happen, even among transhumanists, but extremely rarely. The fringe effect of risk aversion among people signed up for cryonics isn't worth consigning all of their lives to oblivion.