A sequence on how to see through the disguises of answers, beliefs, and statements that don't actually answer, say, or mean anything.
Mysterious Answers to Mysterious Questions is probably the most important core sequence on Less Wrong. Its posts were published from 28 Jul 07 to 11 Sep 07.
Not every belief that we have is directly about sensory experience, but beliefs should pay rent in anticipated experiences. For example, if I believe that the local gravitational acceleration is 9.8 m/s^2, then I should be able to predict where I'll see the second hand on my watch at the time I hear the crash of a bowling ball dropped off a building. On the other hand, if your postmodern English professor says that the famous writer Wulky is a "post-utopian", this may not actually mean anything. The moral is to ask "What experiences do I anticipate?", not "What statements do I believe?"
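As a toy illustration of what "paying rent" looks like, here is a minimal sketch; the building height and speed of sound are assumed values for illustration, not part of the original post:

```python
import math

# Minimal sketch: turn a belief about gravitational acceleration into a
# concrete, checkable prediction about when I'll hear the crash.
g = 9.8                  # believed gravitational acceleration, m/s^2
height = 45.0            # assumed building height, m (illustrative)
speed_of_sound = 343.0   # assumed speed of sound, m/s (illustrative)

fall_time = math.sqrt(2 * height / g)   # time for the ball to reach the ground
sound_delay = height / speed_of_sound   # time for the crash to travel back up
print(f"Expect to hear the crash {fall_time + sound_delay:.2f} s after release")

# If my watch consistently disagrees with this number, the belief has exposed
# itself to refutation by experience -- which is what "paying rent" means.
```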
Suppose someone claims to have a dragon in their garage, but as soon as you go to look, they say, "It's an invisible dragon!" The remarkable thing is that they know in advance exactly which experimental results they will have to excuse, indicating that some part of their mind knows what's really going on. And yet they may honestly believe they believe there's a dragon in the garage. They may perhaps believe it is virtuous to believe there is a dragon in the garage, and believe themselves virtuous, even though they anticipate as if there is no dragon.
You can have some fun with people whose anticipations get out of sync with what they believe they believe. This post recounts a conversation in which a theist had to backpedal when he realized that, by drawing an empirical inference from his religion, he had opened up his religion to empirical disproof.
A woman on a panel enthusiastically declared her belief in a pagan creation myth, flaunting its most outrageously improbable elements. This seemed weirder than "belief in belief" (she didn't act like she needed validation) or "religious profession" (she didn't try to act like she took her religion seriously). So, what was she doing? She was cheering for paganism — cheering loudly by making ridiculous claims.
When you've stopped anticipating-as-if something, but still believe it is virtuous to believe it, this does not create the true fire of the child who really does believe. On the other hand, it is very easy for people to be passionate about group identification - sports teams, political sports teams - and this may account for the passion of beliefs worn as team-identification attire.
This post justifies the use of subjective probability estimates. Suppose you get paid to explain movements of the financial markets after the fact. You'd like to prepare your explanations in advance for each way things could go, and since your preparation time is limited, you do a better job by spending more of it on the outcomes that are actually more likely. So being able to estimate probabilities is useful even to someone who can explain any outcome after the fact.
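A toy sketch of that allocation idea, with invented outcomes, probabilities, and a hypothetical preparation budget:

```python
# Toy sketch: divide a fixed preparation budget in proportion to how likely
# each market outcome seems. The outcomes and probabilities are made up.
prep_hours = 8.0
outcome_probabilities = {
    "bond prices rise": 0.7,
    "bond prices fall": 0.3,
}

for outcome, p in outcome_probabilities.items():
    # More probable outcomes get more drafting time -- a policy only available
    # to someone willing to assign subjective probabilities in the first place.
    print(f"{outcome}: spend {prep_hours * p:.1f} h drafting that explanation")
```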
It was perfectly all right for Isaac Newton to explain just gravity, just the way things fall down - and how planets orbit the Sun, and how the Moon generates the tides - but not the role of money in human society or how the heart pumps blood. Sneering at narrowness is rather reminiscent of ancient Greeks who thought that going out and actually looking at things was manual labor, and manual labor was for slaves.
A hypothesis that forbids nothing permits everything, and thereby fails to constrain anticipation. Your strength as a rationalist is your ability to be more confused by fiction than by reality. If you are equally good at explaining any outcome, you have zero knowledge.
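A small numerical sketch of that last claim, using made-up numbers: a hypothesis whose likelihoods are flat across outcomes can never be favored over plain ignorance, no matter what is observed:

```python
# Made-up numbers: compare a "theory" that explains every outcome equally well
# against plain ignorance (a uniform distribution over outcomes). Their
# likelihoods are identical, so no observation can ever favor the theory.
outcomes = ["A", "B", "C", "D"]
theory_likelihood = {o: 0.25 for o in outcomes}      # "explains" everything
ignorance_likelihood = {o: 0.25 for o in outcomes}   # predicts nothing

prior_theory = 0.5
for o in outcomes:
    num = theory_likelihood[o] * prior_theory
    posterior = num / (num + ignorance_likelihood[o] * (1 - prior_theory))
    print(o, posterior)   # always 0.5: the "theory" carries zero knowledge
```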
Absence of proof is not proof of absence. But absence of evidence is always evidence of absence. According to the probability calculus, if P(H|E) > P(H) (observing E would be evidence for hypothesis H), then P(H|~E) < P(H) (absence of E is evidence against H). The absence of an observation may be strong evidence or very weak evidence of absence, but it is always evidence.
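A quick numerical check of that inequality, with invented probabilities:

```python
# Invented probabilities: if observing E would raise P(H), then the absence
# of E must lower P(H), by Bayes' theorem and the law of total probability.
p_h = 0.3                # prior P(H)
p_e_given_h = 0.8        # P(E|H): H makes the observation likely
p_e_given_not_h = 0.2    # P(E|~H)

p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
p_h_given_e = p_e_given_h * p_h / p_e
p_h_given_not_e = (1 - p_e_given_h) * p_h / (1 - p_e)

print(f"P(H)    = {p_h:.3f}")
print(f"P(H|E)  = {p_h_given_e:.3f}  # seeing E is evidence for H")
print(f"P(H|~E) = {p_h_given_not_e:.3f}  # not seeing E is evidence against H")
```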
If you are about to make an observation, then the expected value of your posterior probability must equal your current prior probability. On average, you must expect to be exactly as confident as when you started out. If you are a true Bayesian, you cannot seek evidence to confirm your theory, because you do not expect any evidence to do that. You can only seek evidence to test your theory.
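A quick numerical check, reusing the invented probabilities from the sketch above:

```python
# The probability-weighted average of the possible posteriors equals the prior
# (conservation of expected evidence), using the same invented numbers.
p_h = 0.3
p_e_given_h, p_e_given_not_h = 0.8, 0.2

p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
p_h_given_e = p_e_given_h * p_h / p_e
p_h_given_not_e = (1 - p_e_given_h) * p_h / (1 - p_e)

expected_posterior = p_e * p_h_given_e + (1 - p_e) * p_h_given_not_e
print(f"Expected posterior = {expected_posterior:.3f}")   # equals the prior, 0.300
```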
Hindsight bias makes us overestimate how well our model could have predicted a known outcome. We underestimate the cost of avoiding a known bad outcome, because we forget that many other equally severe outcomes seemed as probable at the time. Hindsight bias distorts the testing of our models by observation, making us think that our models are better than they really are.
Hindsight bias leads us to systematically undervalue scientific findings, because we find it too easy to retrofit them into our models of the world. This unfairly devalues the contributions of researchers. Worse, it prevents us from noticing when we are seeing evidence that doesn't fit what we really would have expected. We need to make a conscious effort to be shocked enough.
People think that fake explanations use words like "magic", while real explanations use scientific words like "heat conduction". But being a real explanation isn't a matter of literary genre. Scientific-sounding words aren't enough. Real explanations constrain anticipation. Ideally, you could explain only the observations that actually happened. Fake explanations could just as well "explain" the opposite of what you observed.
In schools, "education" often consists of having students memorize answers to specific questions (i.e., the "teacher's password"), rather than learning a predictive model that says what is and isn't likely to happen. Thus, students incorrectly learn to guess at passwords in the face of strange observations rather than admit their confusion. Don't do that: any explanation you give should have a predictive model behind it. If your explanation lacks such a model, start from a recognition of your own confusion and surprise at seeing the result.
Although science does have explanations for phenomena, it is not enough to simply say that "Science!" is responsible for how something works -- nor is it enough to appeal to something more specific like "electricity" or "conduction". Yet for many people, simply noting that "Science has an answer" is enough to make them no longer curious about how it works. In that respect, "Science" is no different from more blatant curiosity-stoppers like "God did it!" But you shouldn't let your interest die simply because someone else knows the answer (which is a rather strange heuristic anyway): You should only be satisfied with a predictive model, and how a given phenomenon fits into that model.