Multiheaded comments on Ritual 2012: A Moment of Darkness - Less Wrong

Post author: Raemon, 28 December 2012 09:09AM




Comment author: Multiheaded 25 December 2012 11:58:22AM, 8 points

Humans' terminal values are adjusted in a way...

No proposal that includes these words is worth considering. There's no Schelling point between forcing people to die at some convenient age and be happy and thankful about it, and just painting smiles on everyone's souls. That's literally what terminal values are all about; you can only trade off between them, not optimize them away whenever it seems expedient!

If it's a terminal value for most people to suffer and grieve over the loss of individual life - and they want to suffer and grieve, and want to want to - a sensible utilitarian would attempt to change the universe so that the conditions for their suffering no longer occur, instead of messing with this oh-so-inconvenient, silly, evolution-spawned value. Because if we were to mess with it, we'd be messing with the very complexity of human values, period.

Comment author: shminux 25 December 2012 06:22:46PM, 1 point

There's no Schelling point between forcing people to die at some convenient age and be happy and thankful about it, and just painting smiles on everyone's souls.

A statement like that needs a mathematical proof.

If it's a terminal value for most people to suffer and grieve over the loss of individual life

"If" indeed. There is little "evolution-spawned" about it (not that that's a good argument to begin with, trusting the "blind idiot god"); a large chunk of this is cultural. If you dig a bit deeper into the reasons why people mourn and grieve, you can usually find more sensible terminal values underneath. Why don't you give it a go?

Comment author: arborealhominid 31 December 2012 03:29:59AM, 0 points

I agree with what you're saying, but just to complicate things a bit: what if humans have two terminal values that directly conflict? Would it be justifiable to modify one to satisfy the other, or would we just have to learn to live with the contradiction? (I honestly don't know what I think.)

Comment author: Multiheaded 31 December 2012 03:44:23AM, 0 points

(I honestly don't know what I think.)

Ah... if you or I knew what to think, we'd be working on CEV right now, and we'd all be much less fucked than we currently are.