
Wei_Dai comments on Under-acknowledged Value Differences - Less Wrong Discussion

47 points · Post author: Wei_Dai · 12 September 2012 10:02PM


Comments (68)


Comment author: Wei_Dai 13 September 2012 01:29:47AM * 20 points

Suppose Alice doesn't want Alice to die, Bob doesn't want Bob to die, and these are the only people and values in the world. Do you think these are not "different" values? (Note that I explicitly mentioned selfish values in the OP as an example of what I meant by "different values".) More importantly, wouldn't such values lead to the necessity of bargaining over how to solve problems that affect both of them?

Comment author: Pfft 13 September 2012 04:05:08PM 4 points

This kind of situation is usually called a "conflict of interest". I think "value differences" is confusing terminology; to me, at least, it suggests some more fundamental difference, such as sacredness vs. avoiding harm.

Comment author: Wei_Dai 13 September 2012 04:40:30PM * 6 points

Ah, that makes sense. (I was wondering why nyan_sandwich's comment was being upvoted so much when I already mentioned selfish values in the OP.) To be clear, I'm using "value differences" to mean both selfish-but-symmetric values and "more fundamental difference such as sacredness vs avoiding harm". (ETA: It makes sense to me because I tend to think of values in terms of utility functions that take world states as inputs.) I guess we could argue about which kind of difference is more important but that doesn't seem relevant to the point I wanted to make.
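To make the utility-function framing concrete, here is a minimal sketch (the world states and payoffs are invented for illustration, not taken from the discussion): Alice's and Bob's selfish values have the same symmetric form, yet as functions of world states they are genuinely different, since they rank some outcomes differently.

```python
# Hypothetical world states describing who survives; payoffs are invented
# purely to illustrate "selfish-but-symmetric" values.
WORLD_STATES = ["both_live", "only_alice_lives", "only_bob_lives", "neither_lives"]

def u_alice(state):
    # Alice's selfish utility: she cares only about whether Alice survives.
    return 1 if state in ("both_live", "only_alice_lives") else 0

def u_bob(state):
    # Bob's selfish utility: identical in form, but a different function
    # of world states, because it refers to Bob's survival instead.
    return 1 if state in ("both_live", "only_bob_lives") else 0

# The two utility functions are symmetric yet disagree on some states:
assert u_alice("only_alice_lives") == 1 and u_bob("only_alice_lives") == 0

# They agree only where interests coincide:
assert u_alice("both_live") == u_bob("both_live") == 1
```

On this view, "symmetric values" and "different values" are compatible descriptions: the functions have the same shape, but because each takes the whole world state as input, they still induce conflicting preferences over shared outcomes.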

Comment author: evand 14 September 2012 02:53:57AM 0 points

It seems like a relevant distinction in the FAI/CEV theory context, and indirectly relevant in the gender conflicts question. That is, it isn't first-order relevant in the latter case, but seems likely to become so in a thread that is attempting to go meta. Like, say, this one.

Comment author: [deleted] 13 September 2012 02:39:27AM 2 points

Good point on selfishness.

What I was getting at is that humans have mostly symmetric values such that they should not disagree over what type of society they want to live in, if they don't get to choose the good end of the stick.

Comment author: Wei_Dai 13 September 2012 06:47:23AM 19 points

Even if people have symmetric values, the relevant facts are not symmetric. For example, everyone values things that money can buy, but some people have a much greater ability to earn money in a free-market economy, so there will be conflict over how much market competition to allow and what kind of redistributive policies to have.

> if they don't get to choose the good end of the stick

I'm not sure what you mean by this. Are you saying something like, "if they were under a Rawlsian veil of ignorance"? But we are in fact not under a Rawlsian veil of ignorance, and any conclusions we make of the form "If I were under a Rawlsian veil of ignorance, I would prefer society to be organized thus: ..." are likely to be biased by the knowledge of our actual circumstances.

Comment author: wedrifid 13 September 2012 11:08:59AM 0 points

> What I was getting at is that humans have mostly symmetric values such that they should not disagree over what type of society they want to live in, if they don't get to choose the good end of the stick.

This seems wrong, except for extremely weak definitions of "mostly". People should definitely disagree about what type of society they want to live in, just a whole lot less than if they were disagreeing with something non-human.