TheOtherDave comments on The Bedrock of Morality: Arbitrary? - Less Wrong

Post author: Eliezer_Yudkowsky, 14 August 2008 10:00PM


Comment author: nshepperd, 16 May 2012 04:33:11AM

I was going to ask how that relation came about, and how it behaves when your brain is computing counterfactuals... but even though those are good questions to consider, I realised that answering them wouldn't really help here. So...

What I'm really trying to say is that there's nothing special about morality at all. There doesn't have to be anything special about it for eudaimonia to be right and for pebble-sorting not to be. It's just a concept, like every other concept: one that includes eudaimonia and excludes murder, and is mostly indifferent to pebble-sorting. Same as the concept "prime" includes {2, 3, 5, 7, ...} and excludes {4, 6, 8, 9, ...}.
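To make the analogy concrete: a concept, on this view, is just a membership test, and "prime" needs nothing special beyond that. A minimal sketch (my own illustration, not anything from the thread):

```python
# A concept as a bare membership test: "prime" includes {2, 3, 5, 7, ...}
# and excludes {4, 6, 8, 9, ...}, with no further "specialness" required.
def is_prime(n: int) -> bool:
    if n < 2:
        return False
    # n is prime iff no d in [2, sqrt(n)] divides it
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

assert [n for n in range(2, 10) if is_prime(n)] == [2, 3, 5, 7]
assert [n for n in range(2, 10) if not is_prime(n)] == [4, 6, 8, 9]
```

The point of the analogy is that asking "but what makes 7 *really* prime?" dissolves once you see that "prime" just denotes this test; the comment claims "right" works the same way.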

The only thing remotely "special" about it is that it happens to be a human terminal value -- which is the only reason we care enough to talk about it in the first place. The only thing remotely special about the word "right" is that it happens to mean, in English, this morality-concept (which happens to be a human terminal value).

So, to say that "eudaimonia is right" is simply to assert that eudaimonia is included in this set of things that includes eudaimonia and excludes murder (in other words, yes, "X ∈ S2 implies X ∈ S1", where S2 is eudaimonia and S1 is morality). To say that what babyeaters value is right would be to assert that eating babies is included in this set ("X ∈ babyeating implies X ∈ S1", which is clearly wrong, since murder ∈ babyeating but murder ∉ S1).
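These membership claims can be written out directly as subset tests. A toy sketch, where the concrete members are hypothetical stand-ins of my own choosing, not claims from the thread:

```python
# S1: the "morality" concept, modelled as a set of included outcomes.
# The members are illustrative placeholders only.
S1 = {"eudaimonia", "friendship", "art"}

# S2: "eudaimonia", here a singleton subset of outcomes
S2 = {"eudaimonia"}

# "babyeating" includes murder, which S1 excludes
babyeating = {"murder"}

# "eudaimonia is right": X ∈ S2 implies X ∈ S1, i.e. S2 ⊆ S1 -- holds
assert S2 <= S1

# "babyeating is right" would require babyeating ⊆ S1 -- fails,
# since murder ∈ babyeating but murder ∉ S1
assert "murder" in babyeating and "murder" not in S1
assert not (babyeating <= S1)
```

Nothing in the subset check itself privileges S1; on the comment's view, the privilege comes entirely from the fact that S1 is what humans happen to terminally value.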

Comment author: TheOtherDave, 16 May 2012 01:36:33PM

So, let us assume there exists some structure S3 in my head that implements my terminal values.

Maybe that's eudaimonia, maybe that's Hungarian goulash, I don't really know what it is, and am not convinced that it's anything internally coherent (that is, I'm perfectly prepared to believe that my S3 includes mutually exclusive states of the world).

I agree that when I label S3 "morality" I'm doing just what I do when I label S2 "eudaimonia" or label some other structure "prime". There's nothing special about the label "morality" in this sense. And if it turns out that you and I have close-enough S3s and we both label our S3s "morality," then we mean the same thing by "morality." Awesome.

If, OTOH, I have S3 implementing my terminal values and you have some different structure, S4, which you also label "morality", then we might mean different things by "morality".

Some day I might come to understand S3 and S4 well enough that I have a clear sense of the difference between them. At that point I have a lexical choice.

I can keep associating S3 with the label "morality" or "right" and apply some other label to S4 (e.g., "pseudo-morality" or "nsheppard's right" or whatever). You might do the same thing. In that case, if (as you say) the only thing that might be remotely special about the label "morality" or "right" is that it might happen to refer to human terminal value, then it follows that there's nothing special about that label in this case, since here it doesn't refer to a common terminal value. It's just another word.

Conversely, I can choose to associate the label "morality" or "right" with some new S5, a synthesis of S3 and S4... perhaps their intersection, perhaps something else. You might do the same thing. At that point we agree that "morality" means S5, even though S5 does not implement either of our terminal values.