Today's post, Why truth? And... was originally published on 27 November 2006. A summary (taken from the LW wiki):

Why should we seek truth? Pure curiosity is an emotion, but not therefore irrational. Instrumental value is another reason, with the advantage of giving an outside verification criterion. A third reason is conceiving of truth as a moral duty, but this might invite moralizing about "proper" modes of thinking that don't work. Still, we need to figure out how to think properly. That means avoiding biases, for which see the next post.

Discuss the post here (rather than in the comments to the original post).

This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was The Martial Art of Rationality, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.


The sequence re-run posts could use an index in the post title so that their order can be determined at a glance.

[anonymous]

I was just lamenting that the answer to "why truth" was not very well spelled out in Post 1. Of course EY is a step ahead of me.

The official answers: because we're curious, because it's necessary to get certain things done, and (more weakly endorsed) because it's the right thing to do. I thought this recent remark

I admit that the original Less Wrong sequences did not heavily emphasize the benefits for everyday life (as opposed to solving ridiculously hard scientific problems).

from EY here was interesting. Did the focus on movement building and self improvement come later?

A couple of skeptical thoughts about those reasons:

  1. Some people feel curiosity less often and less intensely than others, just as some people feel angry less often -- why truth for the uncurious? Conversely, it's possible to sate curiosity by studying and accepting lies rather than truths. L. Ron Hubbard and Saint Augustine strike me as very curious folks.

  2. I can't dispute the instrumental value of a rational approach to building airplanes. But the instrumental value of a rational approach to religion is less clear to me.

Some people feel curiosity less often and less intensely than others, just as some people feel angry less often -- why truth for the uncurious? Conversely, it's possible to sate curiosity by studying and accepting lies rather than truths. L. Ron Hubbard and Saint Augustine strike me as very curious folks.

I agree with this. The post annoyed me with how it treated curiosity as one of the main reasons to seek truth, when there are so many better reasons, at least in the current era. My attitude toward curiosity is similar to the attitude that Eliezer expresses, for example, here toward intellectual competitiveness. (See in particular the part where he says "I'm happy to accept my ignoble motivations as a legitimate part of myself, so long as they're motivations to learn math".)

I can't dispute the instrumental value of a rational approach to building airplanes. But the instrumental value of a rational approach to religion is less clear to me.

It's a waste of time, money, and lives that could be devoted to useful things like science and rational charity. It could be justified by egoism, if not for the potential for greatly prolonged lifespans and greatly improved lives from technology; but people aren't egoists, and religion prevents them from learning about exactly those benefits that destroy even the egoistic reasons.

I can't dispute the instrumental value of a rational approach to building airplanes. But the instrumental value of a rational approach to religion is less clear to me.

It is difficult to estimate the instrumental value of anything when terminal values are unknown or unspecified. If one starts with absolute certainty that a particular religion is true, then rationality may have little instrumental value with regard to religion for that person (except to help them realize that absolute certainty is problematic). On the other hand, if one is deciding whether to join or leave a religion, then epistemic rationality would likely be extremely useful in making the right choice.

[anonymous]

Some comments:

* The phrase "rational emotion" would probably be very misleading in a casual conversation or an argument, and you might have to go to great lengths to explain what it means.

* I suspect the Spock meme is so problematic because it implies that emotion clouds rational judgment (which is true) but goes on to assert that "rational" people need to purge themselves of emotions. When people hear the former, they complete the pattern and think of the latter. As Eliezer points out, there's no reason why we should purge emotions that follow from beliefs.

I have found some success explaining it thus:

"Emotions should be based on facts. (Pause for exclamation) If you feel afraid of touching the stove, and the stove is hot, then you should feel afraid. If you feel afraid of touching the stove, and the stove is cold, then you shouldn't feel afraid. Facts, determining your emotions."

This won't work on anyone inclined to question what "should" means, but it doesn't trip nearly as many alarm bells as saying "the Way".

I don't find this anti-Spock argument very convincing. If the stove is hot, you just shouldn't touch it; there's really no reason to be afraid. Emotions were useful because they elicited the appropriate behaviour in the hunter-gatherer environment, but now, barring extreme situations, we can simply manage to do the proper things.

You can clearly point at rational behaviours and distinguish them from irrational ones, and you can call 'rational' an emotion which induces the rational behaviour. But that doesn't mean that the emotions, per se, are necessary to that effect.

A Spock can really function as a proper and winning rationalist... but we, of course, are no Vulcans.

[anonymous]

Is there an anti-Spock argument you do find convincing?

I am very suspicious of claims that morality implies "intrinsic value" in the pursuit of truth. I am not really sure whether "intrinsic value" actually exists. It seems to me that whatever reasons one has for pursuing truth about a certain topic are reason enough, and we need not posit mysterious notions like "intrinsic value". The moral justification for pursuing truth would be that knowledge leads to true beliefs, which lead to good intentions and actions, and the world is far better off if people are acting on true beliefs; therefore we have good reason to be truth-seekers.