bgrah449 comments on Costs to (potentially) eternal life - Less Wrong

8 Post author: bgrah449 21 January 2010 09:46PM




Comment author: bgrah449 22 January 2010 09:21:20PM 3 points [-]

Imagine Omega said, "The person behind you will live for 30 seconds if you don't kill her. If you kill her, you will continue leading a long and healthy life. If you don't, you will also die in 30 seconds."

Do you say the same thing to Omega and continue enjoying your 30 seconds of life?

Comment author: Cyan 22 January 2010 10:11:34PM *  3 points [-]

For 30 seconds, I kill her. For an hour, we both die. I think my indecision point is around 15 minutes.

Comment author: Eliezer_Yudkowsky 22 January 2010 10:21:05PM 5 points [-]

...thank you for your honest self-reporting, but I do feel obliged to point out that this does not make any sense.

Comment author: Cyan 23 January 2010 01:33:11AM *  2 points [-]

I didn't think it through for any kind of logical consistency -- it's pure snap judgment. I think my instinct when presented with this kind of ethical dilemma is to treat my own QALYs (well, QALY-seconds) as far less valuable than those of another person. Or possibly I'm just paying an emotional surcharge for actually taking the action of ending another person's life. There was also some sense of "having enough time to do something of import (e.g., call loved ones)" in there.

Comment author: bgrah449 22 January 2010 10:18:48PM 1 point [-]

But isn't this time relative to lifespan? What if your entire lifespan were only 30 minutes?

Comment author: mattnewport 22 January 2010 09:38:01PM 4 points [-]

I think my reaction would be "fuck you Omega". If an omniscient entity decides to be this much of a douchebag then dying giving them the finger seems the only decent thing to do.

Comment author: bgrah449 22 January 2010 09:41:12PM 1 point [-]

My implied assumption was that Omega is an excellent predictor, not an actor -- I thought this was a standard assumption, but maybe it isn't.

Comment author: JulianMorrison 23 January 2010 02:10:26AM 2 points [-]

No difference. I won't buy my life with murder at any price. (Weighing one-against-many innocents is a different problem.)

And I'd be calling Omega a bastard because, as an excellent predictor, he'd know that, but decided to ruin my day by telling me anyway.

Comment author: bgrah449 23 January 2010 05:08:19AM 1 point [-]

Can you explain, then, how this is different from suicide, since your theft of her life is minimal, yet your sacrifice of your own life is large?

Comment author: JulianMorrison 23 January 2010 08:26:29PM 0 points [-]

It's not suicide, I'm just bumping into a moral absolute - I won't murder under those circumstances, so outcomes conditional on "I commit murder" are pruned from the search tree. If the only remaining outcome is "I die", then drat.

Comment author: Morendil 22 January 2010 09:42:28PM 1 point [-]

Showing that the original question had little to do with cryonics...

Comment author: bgrah449 22 January 2010 09:45:17PM 1 point [-]

This question is a highly exaggerated example meant to make the incentives vivid, but cryonics subscribers will face choices of this kind, with much subtler probabilities and payoffs.