Vulture comments on The Cryonics Strategy Space - Less Wrong Discussion

24 Post author: Froolow 24 April 2014 04:11PM

Comment author: Vulture 30 April 2014 01:12:26AM 1 point

You could make a consequential case for it, of course

Certainly true, and disturbing, especially for those of us who feel that consequentialism is in some way "correct". Far-future people are virtually guaranteed to hold values radically different from ours, and would likely have the ability to directly modify our (to them frighteningly evil) values. So wouldn't we (per murder-Gandhi) want to spread a deontological system that forbids tampering with other people's values, even if we feel that, in general, consequentialism based on our current society's values is more morally beneficial? That is, would we prefer that some small spark of our moral system survive into the distant future, at the expense of its being weakened in the here and now?