selylindi comments on We Don't Have a Utility Function - Less Wrong

43 [deleted] 02 April 2013 03:49AM



Comment author: TimS 02 April 2013 03:09:11PM *  2 points

> About two years ago, it very much felt like freedom from authority was a terminal value for me. Those hated authoritarians and fascists were simply wrong, probably due to some fundamental neurological fault that could not be reasoned with. The very prototype of "terminal value differences".
>
> And yet here I am today, having been reasoned out of that "terminal value", such that I even appreciate a certain aesthetic in bowing to a strong leader.

On what basis do you assert that you were "reasoned out" of that position? For example, what about your change of mind causes you to reject a conversion metaphor?

> If that was a terminal value, I'm afraid the term has lost much of its meaning to me. If it was not, if even the most fundamental-seeming moral feelings are subject to argument, I wonder if there is any coherent sense in which I could be said to have terminal values at all.

Yes, that's the problem with the conversion metaphor. If reasoning does not cause changes in terminal values, then it seems that terminal values are not real, in some sense of "real". Yet moral anti-realism feels incredibly counterintuitive.


Edit: The other way you might respond is to say that you still value freedom, but have recently realized it is not a terminal value. But that makes the example less useful in figuring out how actual terminal values work.

Comment author: selylindi 16 April 2013 09:33:17PM *  0 points

Perhaps, then, we should speak of what we want in terms of "terminal values"? For example, I might say that it is a terminal value of mine that I should not murder, or that freedom from authority is good.

But what does "terminal value" mean? Usually, it means that the value of something is not contingent on or derived from other facts or situations. For example, I may value beautiful things in a way that is not derived from what they get me. The recursive chain of valuableness terminates at some set of values.

> ... if even the most fundamental-seeming moral feelings are subject to argument, I wonder if there is any coherent sense in which I could be said to have terminal values at all.

TimS mentioned moral anti-realism as one possibility. I have a favorable opinion of desire utilitarianism (search for pros and cons), a system that would be compatible with another possibility: real and objective values, but not necessarily any terminal values.

By analogy, such a view would describe moral values the way epistemological coherentism (versus foundationalism) describes knowledge: the mental model would be a web rather than a hierarchy. At least it's a possibility -- I don't intend to argue for or against it right now, as I have minimal evidence.