Vladimir_Nesov comments on If it were morally correct to kill everyone on earth, would you do it? - Less Wrong

-6 Post author: Bundle_Gerbe 30 January 2013 11:58PM




Comment author: Vladimir_Nesov 31 January 2013 02:57:35AM *  21 points

To the extent your question is, "Suppose X is the correct answer. Is X the correct answer?", X is the correct answer. Outside of that supposition it probably isn't.

Comment author: Bundle_Gerbe 31 January 2013 09:52:30AM 6 points

I don't think that's what I'm asking. Here's an analogy. A person X comes to the conclusion fairly late in life that the morally best thing they can think of to do is to kill themselves in a way that looks like an accident and will their sizable life insurance policy to charity. This conclusion isn't a reductio ad absurdum of X's moral philosophy, even if X doesn't like it. Regardless of this particular example, it could presumably be correct for a person to sacrifice themselves in a way that doesn't feel heroic, isn't socially accepted, and doesn't save the whole world but only a few far-away people. I think most people in such a situation (who managed not to rationalize the dilemma away) would probably not do it.

So I'm trying to envision the same situation for humanity as a whole. Is there any situation that humanity could face that would make us collectively say "Yeah doing Y is right, even though it seems bad for us. But the sacrifice is too great, we aren't going to do it". That is, if there's room for space between "considered morality" and "desires" for an individual, is there room for space between them for a species?

Comment author: Vladimir_Nesov 31 January 2013 04:01:48PM *  5 points

Is there any situation that humanity could face that would make us collectively say "Yeah doing Y is right, even though it seems bad for us. But the sacrifice is too great, we aren't going to do it"?

This is still probably not the question that you want to ask. Humans do incorrect things all the time, with excellent rationalizations, so "But the sacrifice is too great, we aren't going to do it" is not a particularly interesting specimen. To the extent that you think that "But the sacrifice is too great" is a relevant argument, you think that "Yeah doing Y is right" is potentially mistaken.

I guess the motivation for this post is in asking whether it is actually possible for a conclusion like that to be correct. I expect it might be, mainly because humans are not particularly optimized thingies, so it might be more valuable to use the atoms to make something else that's not significantly related to the individual humans. But again to emphasize the consequentialist issue: to the extent such judgment is correct, it's incorrect to oppose it; and to the extent it's correct to oppose it, the judgment is incorrect.

Comment author: Bundle_Gerbe 31 January 2013 10:45:53PM -1 points

To the extent that you think that "But the sacrifice is too great" is a relevant argument, you think that "Yeah doing Y is right" is potentially mistaken.

I think I disagree with this. On a social and political level, the tendency to rationalize is so pervasive that it would sound completely absurd to say "I agree that it would be morally correct to implement your policy, but I advocate not doing it, because it will only help future generations, screw those guys." In practice, when people attempt to motivate each other in the political sphere to do something, the appeal is always accompanied by the claim that doing that thing is morally right. But it is in principle possible to try to get people not to do something by arguing "hey, this is really bad for us!" without arguing against its moral rightness. This thought experiment is a case where this exact "let's grab the banana" position is supposed to be tempting.

Comment author: ArisKatsaris 31 January 2013 11:53:27PM 3 points

People aren't motivated by morality alone -- people aren't required to do what they recognize to be morally correct.

E.g. a parent may choose their kid's life over the lives of a hundred other children, because they care more about their own child -- not because they think it's the morally correct thing to do.

Our moral sense is only one of the many things that motivate us.

Comment author: Vladimir_Nesov 01 February 2013 01:30:55PM *  3 points

Our moral sense is only one of the many things that motivate us.

I'm talking about extrapolated morality, which is not the same thing as moral sense (i.e. judgments accessible on human level without doing much more computation). This extrapolated morality determines what should motivate you, but of course it's not what does motivate you, and neither is non-extrapolated moral sense. In this sense it's incorrect to oppose extrapolated morality (you shouldn't do it), but you are in actuality motivated by other things, so you'll probably act incorrectly (in this sense).

Comment author: BerryPick6 01 February 2013 02:29:32PM 2 points

Could you please point me in the direction of some discussion about 'extrapolated morality' (unless you mean CEV, in which case there's no need)?

Comment author: Vladimir_Nesov 01 February 2013 03:15:09PM *  0 points

CEV for individuals is vaguely analogous to what I'm referring to, but I don't know in any detail what I mean.

Comment author: MugaSofer 01 February 2013 03:07:27PM -1 points

Congratulations, you have successfully answered the title question!

Now, on to the actual post ...