Less Wrong/2006 Articles/Summaries

I have the same doubts about the worth of this summary-writing enterprise as before. Fleshing out stub articles is likely a better use of one's effort. --Vladimir Nesov 08:29, 25 July 2009 (UTC)

I agree. If you're looking for a big table to obsessively fill in, I think it would be more useful to fill in the "All Articles" tables. First fill in the column listing what concepts are introduced or discussed in the article, then make wiki pages for those concepts, possibly including a quote from that article. Also add a link to the article in the "Blog posts" section of the wiki page.

Also, I was considering setting up a script to check if the links on the wiki pages match the list in the "All Articles" pages, and to report which links still need to be added.
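
Something like the following would be a starting point (a rough Python sketch; it assumes the wiki exposes a standard MediaWiki api.php endpoint, and the endpoint URL and page titles below are placeholders to adjust, not confirmed names):

<pre>
import requests

# Rough sketch of the link checker described above. It assumes a standard
# MediaWiki api.php endpoint; the URL and page titles are placeholders.
API_URL = "https://wiki.lesswrong.com/api.php"  # assumed endpoint

def page_links(title):
    """Return the set of page titles linked from the given wiki page.

    pllimit=max returns up to 500 links per request; a page with more links
    than that would need the API's continuation mechanism.
    """
    response = requests.get(API_URL, params={
        "action": "query",
        "prop": "links",
        "titles": title,
        "pllimit": "max",
        "format": "json",
    })
    pages = response.json()["query"]["pages"]
    return {link["title"]
            for page in pages.values()
            for link in page.get("links", [])}

if __name__ == "__main__":
    # Hypothetical page titles; the real script would read the concept column
    # of the "All Articles" tables and loop over the concept pages.
    index_links = page_links("Less Wrong/All Articles")
    concept_links = page_links("Bias")
    for title in sorted(index_links - concept_links):
        print("Still needs a link:", title)
</pre>

This only shows the link-fetching and set-comparison skeleton; the actual matching against the "All Articles" tables would still need to be filled in.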

Also, I heard that someone already wrote short summaries of the articles, but I still haven't found these summaries.

Update: I found those summaries, but only the first 27 posts are actually summarized: http://rob-zahra.blogspot.com/2009/04/overcoming-bias-summaries.html

Also, Eliezer summarized many of his own articles. See, for example:

Here are the links to the "All articles" pages:

--PeerInfinity 14:27, 4 August 2009 (UTC)

Noted. No obsessiveness was involved -- due to a miscommunication, we believed Eliezer would have to spend time doing these summaries if others didn't do them, but this turns out not to be the case. Other than the sequences, though, I'm not aware of any existing summaries. --steven0461

Here are shortcuts to the summary pages:

Created by steven0461 at

__NOTOC__

Rationality is a technique to be trained.

(alternate summary:)

Basic introduction of the metaphor and some of its consequences.

Truth can be instrumentally useful and intrinsically satisfying.

(alternate summary:)

You have an instrumental motive to care about the truth of your beliefs about anything you care about.

Biases are obstacles to truth seeking caused by one's own mental machinery.

Use humility to justify further action, not as an excuse for laziness and ignorance.

Factor in what other people think, but not symmetrically, if they are not epistemic peers.

You can pragmatically say "I don't know", but you rationally should have a probability distribution.

People respond in different ways to clear evidence they're wrong, not always by updating and moving on.

The Modesty Argument states that any two honest Bayesian reasoners who disagree should each take the other's beliefs into account and both arrive at a probability distribution that is the average of the ones they started with. Robin Hanson seems to accept the argument but Eliezer does not. Eliezer gives the example of himself disagreeing with a creationist as evidence for how following the modesty argument could lead to decreased individual rationality. He also accuses those who agree with the argument of not taking it into account when planning their actions.
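
For concreteness, the averaging the argument prescribes looks like this in the simplest case (an illustrative example, not taken from the post): if one reasoner assigns probability 0.9 to a claim and the other assigns 0.1, both are supposed to move to

<math>\tfrac{1}{2}(0.9 + 0.1) = 0.5,</math>

and likewise for every other event in their distributions.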

An edited instant messaging conversation regarding the use of the phrase "I don't know". "I don't know" is a useful phrase if you want to avoid getting in trouble or convey the fact that you don't have access to privileged information.

A story about an underground society divided into two factions: one that believes that the sky is blue and one that believes the sky is green. At the end of the story, the reactions of various citizens to discovering the outside world and finally seeing the color of the sky are described.

There are many more ways to miss than to find the truth. Finding the truth is the point of avoiding the things we call "biases", which form one of the clusters of obstacles that we find: biases are those obstacles to truth-finding that arise from the structure of the human mind, rather than from insufficient information or computing power, from brain damage, or from bad learned habits or beliefs. But ultimately, exactly what we label a "bias" matters less than identifying and correcting whatever obstacles to truth we find.

There are good and bad kinds of humility. Proper humility is not being selectively underconfident about uncomfortable truths. Proper humility is not the same as social modesty, which can be an excuse for not even trying to be right. Proper scientific humility means not just acknowledging one's uncertainty with words, but taking specific actions to plan for the case that one is wrong.

Rationality is the martial art of the mind, building on universally human machinery. But developing rationality is more difficult than developing physical martial arts. One reason is that rationality skill is harder to verify. In recent decades, scientific fields like heuristics and biases, Bayesian probability theory, evolutionary psychology, and social psychology have given us a theoretical body of work on which to build the art of rationality. It remains to develop and especially to communicate techniques that apply this theoretical work introspectively to our own minds.

Why should we seek truth? Pure curiosity is an emotion, but not therefore irrational. Instrumental value is another reason, with the advantage of giving an outside verification criterion. A third reason is conceiving of truth as a moral duty, but this might invite moralizing about "proper" modes of thinking that don't work. Still, we need to figure out how to think properly. That means avoiding biases, for which see the next post.

Biases are one kind of obstacle to finding truth.

End of 2006 articles