Response to Man-with-a-hammer syndrome.
It's been claimed that there is no way to spot Affective Death Spirals, or cultish obsession with the One Big Idea of Everything. I'd like to posit a simple way to spot such errors, with the caveat that it may not work in every case.
There's an old game called Two Truths and a Lie. I'd bet almost everyone's heard of it, but I'll summarize it just in case. A person makes three statements, and the other players must guess which of those statements is false. The statement-maker gets points for fooling people, people get points for not being fooled. That's it. I'd like to propose a rationalist's version of this game that should serve as a nifty check on certain Affective Death Spirals, runaway Theory-Of-Everythings, and Perfectly General Explanations. It's almost as simple.
Say you have a theory about human behaviour. Get a friend to do a little research and assert three factual claims about how people behave that your theory would realistically apply to. At least one of these claims must be false. See if you can explain every claim using your theory before learning which one's false.
If you can come up with a convincing explanation for all three statements, you must be very cautious when using your One Theory. If it can explain falsehoods, there's a very high risk you're going to use it to justify whatever prior beliefs you have. Even worse, you may use it to infer facts about the world, even though it is clearly not consistent enough to do so reliably. You must exercise the utmost caution in applying your One Theory, if not abandon reliance on it altogether. If, on the other hand, you can't come up with a convincing way to explain some of the statements, and those turn out to be the false ones, then there's at least a chance you're on to something.
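The scoring logic above can be sketched as a toy program. This is only an illustration of the test's decision rule, with hypothetical names and claims, not an implementation of anything from the post:

```python
# A minimal sketch of the Two-Truths-and-a-Lie test for a theory.
# All names and example claims here are hypothetical illustrations.

def run_test(theory_explains, claims):
    """theory_explains: a function from a statement to True if the
    theory can produce a convincing explanation for it.
    claims: list of (statement, is_true) pairs, at least one false.
    Returns a verdict on whether the theory discriminates."""
    explained = [theory_explains(s) for s, _ in claims]
    if all(explained):
        # The theory "explains" the lie too: it cannot discriminate,
        # so reliance on it for inference is risky.
        return "unreliable: explains everything, including the lie"
    # The theory failed to explain some claims; it passes only if the
    # unexplained claims are exactly the false ones.
    if all(truth == exp for (_, truth), exp in zip(claims, explained)):
        return "promising: explanations track the truth"
    return "worse than useless: explains lies but not truths"

# A lazy Theory-Of-Everything that explains any claim put to it:
lazy = lambda claim: True
print(run_test(lazy, [("A", True), ("B", True), ("C", False)]))
# -> unreliable: explains everything, including the lie
```

The point the sketch makes concrete: a theory that returns `True` for every claim conveys no information about which claims are true.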
Come to think of it, this is an excellent challenge to any proponent of a Big Idea. Give them three facts, some of which are false, and see if their Idea can discriminate. Just remember to be ruthless when they get it wrong; it doesn't prove their idea is totally wrong, only that reliance upon it would be.
Edited to clarify: My argument is not that one should simply abandon a theory altogether. In some cases abandonment may be justified: if all the theory has going for it is its predictive power, and you show it lacks that, toss it. But in the case of broad, complex theories that genuinely can explain many divergent outcomes, this exercise should teach you not to rely on the theory as a means of inference. Yes, you should believe in evolution. No, you shouldn't make broad inferences about human behaviour without any data merely because those inferences are consistent with evolution, unless your application of the theory of evolution is so precise and well-informed that you can consistently pass the Two-Truths-and-a-Lie Test.
Psychohistorian doesn't say the idea isn't useful, just that reliance on it is incorrect. If the theory is "people mostly do stuff because of signalling", honestly, that's a pretty crappy theory. Once Signalling Guy fails this test, he should take that as a sign to go back and refine the theory, perhaps to
"People do stuff because of signalling when the benefit of the signal, in the environment of evolutionary adaptation, was worth more than its cost."
This means that making predictions requires estimating the cost and benefit of the behavior in advance, which requires a lot more data and computation, but that's what makes the theory a useful predictor instead of just another bogus Big Idea.
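The refinement can be made concrete with a toy sketch: the refined theory commits to a prediction only when cost and benefit estimates are supplied, so it can actually be wrong. The function and the numbers below are made-up illustrations, not real estimates:

```python
# A toy sketch of the refined signalling theory: predict signalling
# behavior only when the estimated benefit of the signal (in the
# environment of evolutionary adaptation) exceeds its estimated cost.
# The scenario names and numbers are hypothetical.

def predicts_signalling(estimated_benefit, estimated_cost):
    """Commits to a yes/no prediction given explicit estimates,
    which is what makes the refined theory falsifiable."""
    return estimated_benefit > estimated_cost

# Conspicuous gift-giving: high estimated signal value, moderate cost.
print(predicts_signalling(estimated_benefit=10.0, estimated_cost=3.0))   # True
# Anonymous donation: little signal value relative to its cost.
print(predicts_signalling(estimated_benefit=1.0, estimated_cost=3.0))    # False
```

The unrefined theory corresponds to a function that ignores its inputs and always returns `True`; the refinement is precisely what forces it to sometimes say no.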
Not to point fingers at Freakonomics fans (not least because I'm guilty of this myself in party conversation) but it's real easy to look at a behavior that doesn't seem to make sense otherwise and say "oh, duh, signalling". The key is that the behavior doesn't make sense otherwise: it's costly, and that's an indication that, if people are doing it, there's a benefit you're not seeing. That technique may be helpful for explaining, but it's not helpful for predicting since, as you pointed out, it can explain anything if there's not enough cost/benefit information to rule it out.
People do all sorts of insane things for reasons other than signaling, though. Like because their parents did it, or because the behavior was rewarded at some point.
Of course, signaling behavior is often rewarded, due to it being successful signaling... which means it migh...