TheAncientGeek comments on Truth vs Utility - Less Wrong Discussion

1 Post author: Qwake 13 August 2014 05:45AM

Comment author: Viliam_Bur 13 August 2014 06:27:10AM *  7 points

obtaining the truth is the rationalists' ultimate goal

Nope. It's an instrumental goal. We just believe it to be very useful, because in nontrivial situations it is difficult to find a strategy for achieving X without having true beliefs about X.

Are there scenarios in which a rationalist should actively try to avoid the truth to maximize their possible utility?

Omega tells you: "Unless you start believing in horoscopes, I will torture all humans to death." (Or, if making oneself believe something false is too difficult, then something like: "There is one false statement in your math textbook, and if you even find out which one it is, I will torture all humans to death." In which case I would avoid looking at the textbook ever again.)

Option 2 is that Omega will truthfully answer one question on absolutely any subject pertaining to our universe, with no strings attached. You can ask about the laws governing the universe, the meaning of life, the origin of time and space, whatever, and Omega will give you an absolutely truthful, knowledgeable answer.

I guess it would depend on how much I would trust myself to ask a question that could bring me even more benefit than option 1. For example: "What is the most likely way that I could become Omega-powerful without losing my values? (Most likely = relative to my current situation and abilities.)" Because a lucky answer to this one could be even better than the first option. -- So it comes down to an estimate of whether such a lucky answer exists, what my probability is of following the strategy successfully if I get the answer, and what my probability is of asking the question correctly. Which I admit I don't know.

Comment author: TheAncientGeek 13 August 2014 04:54:41PM *  0 points

Where truth is a terminal goal, it is a terminal goal. The fact that it is often useful as a means to some other goal does not contradict that. Cf. valuing money for itself, or for what you can do with it.

Comment author: Vulture 13 August 2014 08:58:10PM 0 points

Your statements are perfectly correct. You're probably being downvoted because people are assuming that you're talking about your own values, and they don't believe that you could "really" hold truth as your sole terminal goal.