
TheOtherDave comments on Greg Linster on the beauty of death - Less Wrong Discussion

Post author: Jonathan_Graehl, 20 October 2011 04:47AM




Comment author: TheOtherDave, 21 October 2011 03:23:30PM, 3 points

Not necessarily. If I believed that my continued survival would cause the destruction of everything I valued, suicide would be a value-preserving option and analgesia would not be. More generally: if my values include anything beyond avoiding pain, analgesia isn't necessarily my best value-preserving option.

But, agreed, irreversibility of the sort we're discussing here is highly implausible. But we're discussing low-probability scenarios to begin with.

Comment author: pedanterrific, 22 October 2011 02:58:41AM, 4 points

my continued survival would cause the destruction of everything I valued

This is a situation I hadn't thought of, and I agree that in this case suicide would be preferable. But I hadn't gotten the impression that's what was being discussed - for one thing, if this were a real worry, it would also argue against a two-thousand-year safety interval. I feel like the "Omega threatening to torture your loved ones to compel your suicide" scenario should be kept separate from the "I have no mouth and I must scream" scenario.

More generally: if my values include anything beyond avoiding pain, analgesia isn't necessarily my best value-preserving option.

True, but the problem with pain is that its importance in your hierarchy of values tends to increase with its intensity. Now I'm thinking of a sort of dead man's switch, where outside sensory information requires voluntary opting-in, and the suicide switch can only be accessed from the baseline mental state of total sensory deprivation, or an imaginary field of flowers, or whatever.

But, agreed, irreversibility of the sort we're discussing here is highly implausible. But we're discussing low-probability scenarios to begin with.

I was mostly talking about the irreversibility of suicide, actually. In an AIMS scenario, where I have every reason to expect my whole future to consist of total, mind-crushing suffering, I would still prefer "spend the remaining lifetime of the universe building castles in my head, checking back in occasionally to make sure the suffering hasn't stopped" to "cease to exist, permanently".

Of course, this all rather ignores the unlikelihood of an entity that can impose effectively infinite, total suffering on you but can't hack your mind and remove the suicide switch.