When we talk about rationality, we're generally talking about either epistemic rationality (systematic methods of finding out the truth) or instrumental rationality (systematic methods of making the world more like we would like it to be). We can formalize these with probability theory and decision theory, but the formal theories don't fully cover the difficulty of being rational as a human. There is a lot more to rationality than the formalisms.
Strong emotions can be rational. A rational belief that something good happened leads to rational happiness. But your emotions ought not to change your beliefs about events that do not depend causally on your emotions.
Truth can be instrumentally useful and intrinsically satisfying.
(alternate summary:)
Why should we seek truth? Pure curiosity is an emotion, but not therefore irrational. Instrumental value is another reason, with the advantage of giving an outside verification criterion. A third reason is conceiving of truth as a moral duty, but this might invite moralizing about "proper" modes of thinking that don't work. Still, we need to figure out how to think properly. That means avoiding biases, for which see the next post.
(alternate summary:)
You have an instrumental motive to care about the truth of your beliefs about anything you care about.
Biases are obstacles to truth seeking caused by one's own mental machinery.
(alternate summary:)
There are many more ways to miss the truth than to find it, and finding the truth is the point of avoiding the things we call "biases". Biases are one particular cluster of obstacles to truth-finding: those that arise from the structure of the human mind itself, rather than from insufficient information or computing power, from brain damage, or from bad learned habits or beliefs. Ultimately, though, whether or not an obstacle gets called a "bias" matters less than the obstacle itself.
Availability bias is the tendency to estimate the probability of an event based on whatever evidence about that event comes to mind, without taking into account that some pieces of evidence are more memorable than others, or easier to come by than others. The result is a mismatched data set, which produces a distorted model and a biased estimate.
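The mechanism can be simulated: if memorable events are over-represented in what we recall, an estimate built from recalled instances will be inflated relative to one built from a representative sample. The numbers below (a 10% true rate, dramatic events 10x more memorable) are purely hypothetical, a minimal sketch of the sampling-bias effect:

```python
import random

random.seed(0)

# Hypothetical world: "dramatic" events are 10% of all events.
events = ["dramatic"] * 100 + ["ordinary"] * 900

# Unbiased estimate: sample events uniformly at random.
uniform_sample = [random.choice(events) for _ in range(10_000)]
p_unbiased = uniform_sample.count("dramatic") / len(uniform_sample)

# Availability-style estimate: suppose dramatic events are 10x more
# memorable, so they are 10x more likely to "pop into mind".
weights = [10 if e == "dramatic" else 1 for e in events]
recalled_sample = random.choices(events, weights=weights, k=10_000)
p_biased = recalled_sample.count("dramatic") / len(recalled_sample)

print(p_unbiased)  # close to the true rate of 0.10
print(p_biased)    # well above 0.10: the memorable events dominate recall
```

The biased estimator converges not to the true frequency but to the memorability-weighted frequency, which is exactly the "mismatched data set" the summary describes.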
If you want to avoid the conjunction fallacy, you must feel a stronger emotional impact from Occam's Razor: each additional detail added to a claim should feel as though it drives the claim's probability down toward zero.
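The underlying arithmetic is the conjunction rule: since P(A and B) = P(A) x P(B|A) and P(B|A) is at most 1, a more detailed claim can never be more probable than a less detailed one. The numbers below are hypothetical, chosen only to illustrate the inequality:

```python
# Conjunction rule: adding a detail B to a claim A can only shrink
# (or at best preserve) the claim's probability.
p_a = 0.3          # hypothetical: P(claim A)
p_b_given_a = 0.2  # hypothetical: P(detail B, given A)

# P(A and B) = P(A) * P(B|A) <= P(A), because P(B|A) <= 1.
p_conjunction = p_a * p_b_given_a

print(p_conjunction)  # 0.06, strictly less than p_a = 0.3
```

Judging the detailed version more probable than the plain version (as subjects famously do in Tversky and Kahneman's "Linda" problem) therefore violates the laws of probability, whatever the specific numbers are.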
We tend to plan by envisioning that everything will go as expected. Even assuming that such an estimate is accurate conditional on everything going as expected, things rarely do go as expected, so such plans are systematically optimistic.
From the old discussion page:
Talk:Rationality: From AI to Zombies Summaries
Copy-pasted summaries from https://wiki.lesswrong.com/wiki/Less_Wrong/Article_summaries on 5/1/17. Eventually, summaries should probably be their own pages so that different categorization pages could exist. - Adam Zerner, 5/1/17