Less Wrong tends toward long articles with a lot of background material. That's great, but the vast majority of people will never read them. What would be useful for raising the sanity waterline in the general population is a collection of simple-but-useful rationality techniques that you might be able to teach to a reasonably smart person in five minutes or less per technique.
Carl Sagan had a slogan: "Extraordinary claims require extraordinary evidence." He would say this phrase and then explain how, when someone claims something extraordinary (i.e. something for which we have a very low probability estimate), they need correspondingly stronger evidence than if they'd made a higher-likelihood claim, like "I had a sandwich for lunch." We can talk about this very precisely, in terms of Bayesian updating and conditional probability, but Sagan was able to get a lot of this across to random laypeople in about a minute. Maybe two minutes.
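For anyone who wants the precise version, here is a minimal sketch of Sagan's point in Bayesian terms (the one-in-a-million prior and the 99%-reliable evidence are made-up numbers, chosen purely for illustration):

```python
def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' theorem: P(claim | evidence)."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

# An ordinary claim ("I had a sandwich for lunch"): prior ~0.5.
ordinary = posterior(prior=0.5, p_evidence_if_true=0.99, p_evidence_if_false=0.01)

# An extraordinary claim: prior one in a million, same quality of evidence.
extraordinary = posterior(prior=1e-6, p_evidence_if_true=0.99, p_evidence_if_false=0.01)

print(f"Ordinary claim, given the evidence:      {ordinary:.4f}")       # ~0.99
print(f"Extraordinary claim, given the evidence: {extraordinary:.6f}")  # ~0.0001
```

The same evidence that settles the ordinary claim leaves the extraordinary one at about one in ten thousand; the claim only becomes credible when the evidence's false-positive rate is small compared to the prior.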
What techniques for rationality can be explained to a normal person in under five minutes? I'm looking for small and simple memes that will make people more rational, on average. Here are some candidates, to get the discussion started:
Candidate 1 (suggested by DuncanS): Unlikely events happen all the time. Someone gets in a car crash and barely misses being impaled by a metal pole, and people say it's a million-to-one miracle -- but events just as unlikely occur all the time. If you look at how many highly unlikely things could happen, and how many chances they have to happen, then it's obvious that we're going to see "miraculous" coincidences, purely by chance. Similarly, with millions of people dying of cancer each year, there are going to be lots of people making highly unlikely miracle recoveries. If they didn't, that would be surprising.
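A quick back-of-the-envelope sketch shows why (all three quantities below are assumptions picked for illustration):

```python
# How likely is it that SOME million-to-one event happens today, somewhere?
p = 1e-6                   # probability of any single "miracle" (assumed)
chances_per_day = 1_000    # coincidence opportunities per person per day (assumed)
population = 1_000_000     # number of people we're watching (assumed)

trials = chances_per_day * population   # one billion chances
p_none = (1 - p) ** trials              # ~ e^-1000, effectively zero
print(f"P(at least one million-to-one event today) = {1 - p_none:.6f}")
```

With these numbers, a "miraculous" coincidence somewhere is all but guaranteed, every single day.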
Candidate 2: Admitting that you were wrong is a way of winning an argument. (The other person wins, too.) There's a saying that "It takes a big man to admit he's wrong," and when people say this, they don't seem to realize that it's a huge problem! It shouldn't be hard to admit that you were wrong about something! It shouldn't feel like defeat; it should feel like success. When you lose an argument with someone, it should be time for high fives and mutual jubilation, not shame and anger. The hard part of retraining yourself to think this way is just realizing that feeling good about conceding an argument is even an option.
Candidate 3: Everything that has an effect in the real world is part of the domain of science (and, more broadly, rationality). A lot of people have the truly bizarre idea that some theories are special, immune to whatever standards of evidence they may apply to any other theory. My favorite example is people who believe that prayers for healing actually make people who are prayed for more likely to recover, but that this cannot be scientifically tested. This is an obvious contradiction: they're claiming a measurable effect on the world and then pretending that it can't possibly be measured. I think that if you pointed out a few examples of this kind of special pleading to people, they might start to realize when they're doing it.
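As a toy illustration that such a claim really is testable, here's a sketch of the obvious randomized comparison (every number is invented; a real trial would need proper statistics):

```python
import random

random.seed(0)
n = 10_000               # patients per group (invented)
base_rate = 0.30         # recovery rate without prayer (invented)
claimed_effect = 0.00    # set this above zero to simulate a real effect

prayed_for = sum(random.random() < base_rate + claimed_effect for _ in range(n))
control    = sum(random.random() < base_rate for _ in range(n))

print(f"Recovery rate, prayed for: {prayed_for / n:.3f}")
print(f"Recovery rate, control:    {control / n:.3f}")
```

If the effect is real and measurable, it shows up in exactly this kind of comparison; if someone insists it can't show up, they've conceded it has no measurable effect after all.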
Anti-candidate: "Just because something feels good doesn't make it true." I call this an anti-candidate because, while it's true, it's seldom helpful. People trot out this line as an argument against other people's ideas, but rarely apply it to their own. I want memes that will make people actually be more rational, instead of just feeling that way.
This was adapted from an earlier discussion in an Open Thread. One suggestion, based on the comments there: if you're not sure whether something can be explained quickly, just go for it! Write a one-paragraph explanation, and try to keep the inferential distances short. It's good practice, and if we can come up with some really catchy ones, it might be a good addition to the wiki. Or we could use them as rationalist propaganda, somehow. There are a lot of great ideas on Less Wrong that I think can and should spread beyond the usual LW demographic.
Here's something that comes up in many, many discussions of climate change and anything else where a lot of arguments come from models or simulations: sometimes you have to do the math to make a valid (counter-)argument.
Example:
A: ...And so, you see, as CO2 increases, the mean global temperature will also increase.
B: That's bullshit, and here's why: as CO2 increases, there will be more photosynthesis -- and the increased plant growth will consume all that extra CO2.
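To make that concrete, here's the rough arithmetic B skipped (the figures are round approximations of widely cited carbon-budget numbers, used only for illustration):

```python
# Natural sinks DO absorb extra CO2 -- but about half of emissions, not all.
emissions_gtc = 10.0    # annual fossil-fuel CO2 emissions, ~10 GtC/yr (approx.)
sink_fraction = 0.5     # share absorbed by land + ocean sinks (approx.)

remaining = emissions_gtc * (1 - sink_fraction)
print(f"CO2 accumulating in the atmosphere: ~{remaining:.0f} GtC per year")
```

The feedback B points to is real, but its magnitude is roughly half, not roughly all, so atmospheric CO2 keeps rising. Direction alone can't settle the argument.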
Another example (the one that motivated this comment):
A: And so, as long as the bus is carrying six or more passengers, it'll be more efficient than the passenger-equivalent number of cars.
B: That's bullshit! Buses are ten times heavier than cars, so it's got to be ten or more bus passengers.
People often think that in discussions of quantitative phenomena, it's enough to make arguments based purely on the direction of some driver or effect, when really the magnitude of that effect is hugely important. Of course there are negative feedbacks, countervailing forces, etc., but (a) usually they're already accounted for in the original model, so B isn't telling anyone anything new, and (b) magnitude matters.
I believe that in the first example, "A" is supposed to be right. In the second example, is "A" or "B" supposed to be right? B is doing the math, but assumes that fuel required is proportional to mass, which is wrong due at least to engine size and air resistance. (Consider the (mass x miles)/gallon of a 2005 RST1000 Futura motorcycle (565 lb x 42 mpg = 23,730), a Smart car (1,808 lb x 40 mpg = 72,320), a 2010 Honda Civic DX-VP (2,709 lb x 36 mpg = 97,524), and a 2010 Toyota Camry SE (3,329 lb x 33 mpg = 109,857). All MPG figures are EPA highway estimates.)
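Running the quoted numbers makes the non-proportionality obvious. Here's a quick sketch using exactly the figures above; if fuel use were proportional to mass, (mass x mpg) would come out roughly constant across vehicles:

```python
# Compute (mass x miles)/gallon for the four vehicles quoted above.
# If fuel consumption scaled with mass, these products would be similar.
vehicles = {
    "2005 RST1000 Futura motorcycle": (565, 42),
    "Smart car":                      (1808, 40),
    "2010 Honda Civic DX-VP":         (2709, 36),
    "2010 Toyota Camry SE":           (3329, 33),
}

for name, (mass_lb, mpg) in vehicles.items():
    print(f"{name:32s} {mass_lb * mpg:>8,} lb-mi/gal")
```

The product rises nearly fivefold from the motorcycle to the sedan, so "ten times heavier" does not imply "ten times the fuel per passenger-mile".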