
Houshalter comments on Open thread, Sep. 19 - Sep. 25, 2016 - Less Wrong Discussion

2 Post author: DataPacRat 19 September 2016 06:34PM




Comment author: Houshalter 21 September 2016 08:05:36PM 0 points

Well, what if suicide is illegal in the future? And even if it isn't, suicide is really hard to go through with. A lot of people say they would prefer not to be revived with brain damage, yet people with brain damage do not commonly kill themselves.

Comment author: Dagon 22 September 2016 04:23:47PM 2 points

I see this combination of expressed preference and action (would prefer not to live with brain damage, but then actually choose to live with brain damage) as a failure of imagination and incorrect far-mode statements, NOT as an indication that the prior statement was true but was thwarted by some outside force.

Future-me instances have massively more information about what they're experiencing in the future than present-me has now. It's ludicrous for present-me to try to constrain future-me's decisions, and even more so to try to identify situations where present-me's wishes will be honored but future-me's decisions won't.

You can prevent adverse revival by cremation or burial (in which case you also prevent felicitous revival). If an evil regime wants you, any contract language is useless. If an individual-respecting regime considers your revival, future-you would prefer to be revived and asked, rather than being held to a past-you document that cannot predict the details of the situation very well.

Comment author: Lumifer 21 September 2016 08:50:33PM 1 point

> Well what if suicide is illegal in the future?

More to the point, what if suicide is impossible? It's not hard at all to prevent an em from committing suicide — and, of course, if you have copies and backups, he can suicide all he wants...