Less Wrong is a community blog devoted to refining the art of human rationality.
Being levels above in [rationality] means doing rationalist practice 101 much better than others [just like] being a few levels above in fighting means executing a basic front-kick much better than others.
I fear not the man who has practiced 10,000 kicks once, but I fear the man who has practiced one kick 10,000 times.
- Bruce Lee
Recently, when Eliezer wanted to explain why he thought Anna Salamon was among the best rationalists he knew, he picked out one feature of Anna's behavior in particular:
I see you start to answer a question, and then you stop, and I see you get curious.
For me, the ability to reliably get curious is the basic front-kick of epistemic rationality. The best rationalists I know are not necessarily those who know the finer points of cognitive psychology, Bayesian statistics, and Solomonoff Induction. The best rationalists I know are those who can reliably get curious.
A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?
If you haven't seen this question before and you're like most people, your brain screams "10 cents!" But elementary algebra shows that can't be right. The correct answer is 5 cents. To get the right answer, I explained, you need to interrupt your intuitive judgment and think "No! Algebra."
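The algebra behind "5 cents" can be checked in a couple of lines. This is just an illustrative sketch, working in integer cents to avoid floating-point noise:

```python
# Let the ball cost x cents. The bat costs x + 100, and together they
# cost 110. Then x + (x + 100) = 110, so 2x = 10 and x = 5.
total = 110        # $1.10 in cents
difference = 100   # the bat costs $1.00 more than the ball
ball = (total - difference) // 2
bat = ball + difference
assert ball + bat == total and bat - ball == difference
print(ball)  # 5 cents, not 10
```

If the ball cost 10 cents, the bat would cost $1.10 and the total would be $1.20, which is exactly the check the intuitive answer skips.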
A lot of rationalist practice is like that. Whether thinking about physics or sociology or relationships, you need to catch your intuitive judgment and think "No! Curiosity."
Most of us know how to do algebra. How does one "do" curiosity?
Below, I propose a process for how to "get curious." I think we are only just beginning to learn how to create curious people, so please don't take this method as Science or Gospel but instead as an attempt to Just Try It.
As with my algorithm for beating procrastination, you'll want to practice each step of the process in advance so that when you want to get curious, you're well-practiced on each step already. With enough practice, these steps may even become habits.
What would it look like if someone was truly curious — if they actually wanted true beliefs? Not someone who wanted to feel like they sought the truth, or to feel their beliefs were justified. Not someone who wanted to signal a desire for true beliefs. No: someone who really wanted true beliefs. What would that look like?
A truly curious person would seek to understand the world as broadly and deeply as possible. They would study the humanities but especially math and the sciences. They would study logic, probability theory, argument, scientific method, and other core tools of truth-seeking. They would inquire into epistemology, the study of knowing. They would study artificial intelligence to learn the algorithms, the math, the laws of how an ideal agent would acquire true beliefs. They would study modern psychology and neuroscience to learn how their brain acquires beliefs, and how those processes depart from ideal truth-seeking processes. And they would study how to minimize their thinking errors.
They would practice truth-seeking skills as a musician practices playing her instrument. They would practice "debiasing" techniques for reducing common thinking errors. They would seek out contexts known to make truth-seeking more successful. They would ask others to help them on their journey. They would ask to be held accountable.
They would cultivate that burning itch to know. They would admit their ignorance but seek to destroy it.
They would not flinch away from experiences that might destroy their beliefs. They would train their emotions to fit the facts.
They would update their beliefs quickly. They would resist the human impulse to rationalize.
But even all this could merely be a signaling game to increase their status in a group that rewards the appearance of curiosity. Thus, the final test for genuine curiosity is behavioral change. You would find a genuinely curious person studying and learning. You would find them practicing the skills of truth-seeking. You wouldn't merely find them saying, "Okay, I'm updating my belief about that" — you would also find them making decisions consistent with their new belief and inconsistent with their former belief.
Every week I talk to people who say they are trying to figure out the truth about something. When I ask them a few questions about it, I often learn that they know almost nothing of logic, probability theory, argument, scientific method, epistemology, artificial intelligence, human cognitive science, or debiasing techniques. They do not regularly practice the skills of truth-seeking. They don't seem to say "oops" very often, and they change their behavior even less often. I conclude that they probably want to feel they are truth-seeking, or they want to signal a desire for truth-seeking, or they might even self-deceivingly "believe" that they place a high value on knowing the truth. But their actions show that they aren't trying very hard to have true beliefs.
Dare I say it? Few people look like they really want true beliefs.
A crucial question at the beginning of any research project is: why should my group succeed in answering a question where others may have tried and failed?
Here's how I'm going about dividing the possible worlds, but I'm interested to see if anyone has other strategies. First, the whole question is conditional on nobody having already answered the particular question you're interested in. So you first need an exhaustive literature review, one whose intensity should scale with how much effort you expect to spend on the project. Still nothing? These are the remaining possibilities:
1) Nobody else has ever thought of your question, even though all of the pieces of knowledge needed to formulate it have been known for years. If the field has many people involved, the probability of this is vanishingly small and you should systematically disabuse yourself of your fantasies if you think like this often. Still... if true, the prognosis: a good sign.
2) Nobody else has ever thought of your question, because it wouldn't have been ask-able without pieces of knowledge that were discovered just recently. This is common in fast-paced fields and it's why they can be especially exciting. The prognosis: a good sign, but work quickly!
3) Others have thought of your question, but didn't think it was interesting enough to devote serious attention to. We should take this seriously, as how informed others choose to allocate their attention is one of our better approximations to real prediction markets. So, the prognosis: bad sign. Figure out whether you can not only answer your question but validate its usefulness / importance, too.
4) Others have thought of your question, thought it was interesting, but have never tried to answer it because of resource or tech constraints, which you do not face. Prognosis: probably the best-case scenario.
5) Others have thought of your question and run the relevant tests, but failed to get any consistent / reliable results. It'd be nice if there were no publication bias, but of course there is: people are much more likely to publish statistically significant, positive results. Due to this bias, it is sometimes hard to tell precisely how many dead skeletons and dismembered brains line your path, and because of this uncertainty you must assign this possibility a non-zero probability. The prognosis: a bad sign, but do you feel lucky?
6) Others have thought of your question, run the relevant tests, and failed to get consistent / reliable results, but used a different method than the one you will use. Your new tech might clear up some of the murkiness, but it's important here to be precise about which specific issues your method solves and which it doesn't. The prognosis: all things equal, a good sign.
These are the considerations we make when we decide whether to pursue a given topic. But even if you do choose to pursue the question, some of these possibilities have policy recommendations for how to proceed. For example, using new tech, even if it's not necessarily demonstrably better in all cases, seems like a good idea given the possibility of #6.
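The six possibilities above amount to a rough decision tree. As a toy illustration only (the branching questions and verdicts are my paraphrase of the list, not a validated procedure):

```python
# Toy triage over the six possible worlds described above, assuming the
# literature review has already come up empty. The questions and verdicts
# paraphrase the post's heuristics; they are illustrative, not definitive.

def triage(thought_of_before: bool, recently_askable: bool,
           judged_interesting: bool, tests_attempted: bool,
           same_method_as_yours: bool) -> str:
    if not thought_of_before:
        if recently_askable:
            return "2: good sign, but work quickly"
        return "1: good sign, but vanishingly unlikely in a crowded field"
    if not judged_interesting:
        return "3: bad sign; validate the question's importance too"
    if not tests_attempted:
        return "4: probably the best case; constraints others faced are lifted"
    if same_method_as_yours:
        return "5: bad sign; beware the file drawer of unpublished failures"
    return "6: all else equal, a good sign; state what your method fixes"

# Example: a known, interesting question nobody could test until now.
print(triage(thought_of_before=True, recently_askable=False,
             judged_interesting=True, tests_attempted=False,
             same_method_as_yours=False))
```

The point of writing it out this way is that each branch forces an explicit answer to "why hasn't this been done?", which is the question the post says you must confront before committing resources.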
Your friend tells you that a certain rock formation on Mars looks a lot like a pyramid, and that maybe it was built by aliens in the distant past. You scoff, and respond that a lot of geological processes can produce regular-looking rocks, and in all the other cases like this closer investigation has revealed the rocks to be completely natural. You think this whole conversation is silly and don't want to waste your time on such nonsense. Your friend scoffs and asks:
"Where's your sense of mystery?"
You respond, as you have been taught to do, that your sense of mystery is exactly where it should be, among all of the real non-flimflam mysteries of science. How exactly does photosynthesis happen, what is the relationship between gravity and quantum theory, what is the source of the perturbations in Neptune's orbit? These are the real mysteries, not some bunkum about aliens. And if we cannot learn to take joy in the merely real, our life will be empty indeed.
But do you really believe it?
I loved the Joy in the Merely Real sequence. But it spoke to me because it's one of the things I have the most trouble with. I am the kind of person who would have much more fun reading about the Martian pyramid than about photosynthesis.
And the one shortcoming of Joy in the Merely Real was that it was entirely normative, and not descriptive. It tells me I should reserve my sense of mystery for real science, but doesn't explain why it's so hard to do so, or why most people never even try.
So what is this sense of mystery thing anyway?
"The first virtue is curiosity."
—The Twelve Virtues of Rationality
As rationalists, we are obligated to criticize ourselves and question our beliefs... are we not?
Consider what happens to you, on a psychological level, if you begin by saying: "It is my duty to criticize my own beliefs." Roger Zelazny once distinguished between "wanting to be an author" versus "wanting to write". Mark Twain said: "A classic is something that everyone wants to have read and no one wants to read." Criticizing yourself from a sense of duty leaves you wanting to have investigated, so that you'll be able to say afterward that your faith is not blind. This is not the same as wanting to investigate.
This can lead to motivated stopping of your investigation. You consider an objection, then a counterargument to that objection, then you stop there. You repeat this with several objections, until you feel that you have done your duty to investigate, and then you stop there. You have achieved your underlying psychological objective: to get rid of the cognitive dissonance that would result from thinking of yourself as a rationalist, and yet knowing that you had not tried to criticize your belief. You might call it purchase of rationalist satisfaction—trying to create a "warm glow" of discharged duty.