DanArmak comments on People v Paper clips - Less Wrong Discussion

-1 Post author: jdinkum 21 May 2012 04:06PM

Comment author: DanArmak 21 May 2012 06:46:16PM * 2 points

Values (utilities, goals, etc.) are arational. Rationality, LW or otherwise, has nothing to say about "correctness" of terminal values. (Epistemic rationality - the study of how to discover objective truth - is valuable for most actual values, which reference the objective, real world; but it is still only a tool, not necessarily valued for itself.)

Many LW posters and readers share some values, including valuing human life; so we find it productive to discuss them. But no one can or will tell you that you should or ought to hold that value, or any other value - except as an instrumental sub-goal of another value you already have.

Your expression, "inherent values", is at best confusing. Values cannot be attributes purely of the valued things; they are always attributes of the tuple (valued thing, valuing agent). It doesn't make sense to say they are "inherent" in just one of those two parts.
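The two-place nature of value can be made concrete with a minimal sketch. All names and numbers below are hypothetical, invented purely for illustration; the point is only that the lookup is keyed by the (agent, thing) pair, so asking for the value of a thing alone is an underspecified query:

```python
# Illustrative sketch: value is a function of the (agent, thing) pair,
# not a one-place property of the thing alone.
# All agents, things, and numbers below are hypothetical.

def value(agent, thing):
    """Look up how much this agent values this thing."""
    preferences = {
        ("human", "human life"): 100.0,
        ("human", "paperclip"): 0.01,
        ("paperclip maximizer", "human life"): 0.0,
        ("paperclip maximizer", "paperclip"): 100.0,
    }
    return preferences.get((agent, thing), 0.0)

# The same thing gets different values from different agents, so an
# "inherent value of a paperclip" - value(thing) with no agent - is
# not well-defined:
print(value("human", "paperclip"))                # 0.01
print(value("paperclip maximizer", "paperclip"))  # 100.0
```

Dropping the `agent` argument would force a single number per thing, which is exactly the "inherent value" framing the comment argues against.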

Now, if you ask why many people here share this value, the answers are going to be of two kinds. First, why people in general have a high likelihood of holding this value. And second, whether this site tends to filter or select people based on their holding this value, and if so how and why it does that. These are important, deep, interesting questions that may allow for many complex answers, which I'm not going to try to summarize here. (A brief version, however, is that people care more about other people than about paperclips, because other people supply or influence almost all that a person tends to need or want in life, while paperclips give the average person little joy. I doubt that's what you're asking about.)

Comment author: Vladimir_Nesov 21 May 2012 10:45:06PM 3 points

Rationality, LW or otherwise, has nothing to say about "correctness" of terminal values.

Correctness is the property of a description that accords with the thing being described. When you ask, "What are my terminal values?", you are seeking just such a description. A belief about terminal values can be correct or incorrect when it reflects or doesn't reflect the terminal values themselves. This is not fundamentally different from a belief about yesterday's weather being correct or incorrect when it reflects the weather correctly or incorrectly. Of course, the weather itself can't be "correct" or "incorrect".

Comment author: jdinkum 22 May 2012 03:15:05PM * 1 point

I've been trying to work through Torture versus Dustspecks and The Intuitions Behind Utilitarianism and getting stuck...

It seems values are arational, but there can be an irrational difference between what we believe our values are and what they really are.

Comment author: DanArmak 22 May 2012 04:11:20PM 0 points

there can be an irrational difference between what we believe our values are and what they really are.

Certainly. We are not transparent to ourselves: we have subconscious and situation-dependent drives; we don't know in advance precisely how we'll respond to hypothetical situations, or how much we'll enjoy and value them; we have various biases and inaccurate or fabricated memories that cause us to value things wrongly because we incorrectly remember enjoying them; our conscious selves self-deceive and are deceived by other brain modules; and so on.

Moreover, humans don't have well-defined (or definable) utility functions; our different values conflict.
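The claim that conflicting values resist a single utility function has a standard concrete form: if an agent's pairwise preferences contain a cycle, no assignment of real-valued utilities can reproduce them. A small sketch, using an invented cyclic preference set for illustration:

```python
from itertools import permutations

# Hypothetical cyclic preferences: A preferred to B, B to C, C to A.
# A utility function u would need u[A] > u[B] > u[C] > u[A] - impossible.
prefers = {("A", "B"), ("B", "C"), ("C", "A")}

def represented_by(utility):
    """True if this utility assignment reproduces every stated preference."""
    return all(utility[x] > utility[y] for x, y in prefers)

# Try every strict ranking of the three outcomes: none reproduces the cycle.
rankings = [dict(zip(p, (3, 2, 1))) for p in permutations("ABC")]
print(any(represented_by(u) for u in rankings))  # False
```

Exhausting the six strict rankings suffices here because any real-valued utility function induces some ordering of the three outcomes, and ties fail the strict comparisons even sooner.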