The first virtue is curiosity.
As rationalists, we are obligated to criticize ourselves and question our beliefs . . . are we not?
Consider what happens to you, on a psychological level, if you begin by saying: “It is my duty to criticize my own beliefs.” Roger Zelazny once distinguished between “wanting to be an author” versus “wanting to write.” Mark Twain said: “A classic is something that everyone wants to have read and no one wants to read.” Criticizing yourself from a sense of duty leaves you wanting to have investigated, so that you’ll be able to say afterward that your faith is not blind. This is not the same as wanting to investigate.
This can lead to motivated stopping of your investigation. You consider an objection, then a counterargument to that objection, then you stop there. You repeat this with several objections, until you feel that you have done your duty to investigate, and then you stop there. You have achieved your underlying psychological objective: to get rid of the cognitive dissonance that would result from thinking of yourself as a rationalist, and yet knowing that you had not tried to criticize your belief. You might call it purchase of rationalist satisfaction—trying to create a “warm glow” of discharged duty.
Afterward, your stated probability level will be high enough to justify your keeping the plans and beliefs you started with, but not so high as to evoke incredulity from yourself or other rationalists.
When you’re really curious, you’ll gravitate to inquiries that seem most promising of producing shifts in belief, or inquiries that are least like the ones you’ve tried before. Afterward, your probability distribution likely should not look like it did when you started out—shifts should have occurred, whether up or down; and either direction is equally fine to you, if you’re genuinely curious.
Contrast this to the subconscious motive of keeping your inquiry on familiar ground, so that you can get your investigation over with quickly, so that you can have investigated, and restore the familiar balance on which your familiar old plans and beliefs are based.
As for what I think true curiosity should look like, and the power that it holds, I refer you to “A Fable of Science and Politics” in the first book of this series, Map and Territory. The fable showcases the reactions of different characters to an astonishing discovery, with each character’s response intended to illustrate different lessons. Ferris, the last character, embodies the power of innocent curiosity: which is lightness, and an eager reaching forth for evidence.
Ursula K. Le Guin wrote: “In innocence there is no strength against evil. But there is strength in it for good.”1 Innocent curiosity may turn innocently awry; and so the training of a rationalist, and its accompanying sophistication, must be dared as a danger if we want to become stronger. Nonetheless we can try to keep the lightness and the eager reaching of innocence.
As it is written in “The Twelve Virtues of Rationality”:
If in your heart you believe you already know, or if in your heart you do not wish to know, then your questioning will be purposeless and your skills without direction. Curiosity seeks to annihilate itself; there is no curiosity that does not want an answer.
There just isn’t any good substitute for genuine curiosity. A burning itch to know is higher than a solemn vow to pursue truth. But you can’t produce curiosity just by willing it, any more than you can will your foot to feel warm when it feels cold. Sometimes, all we have is our mere solemn vows.
So what can you do with duty? For a start, you can try to take an interest in your dutiful investigations—keep a close eye out for sparks of genuine intrigue, or even genuine ignorance and a desire to resolve it. This goes right along with keeping a special eye out for possibilities that are painful, that you are flinching away from—it’s not all negative thinking.
It should also help to meditate on “Conservation of Expected Evidence.” For every new point of inquiry, for every piece of unseen evidence that you suddenly look at, the expected posterior probability should equal your prior probability. In the microprocess of inquiry, your belief should always be evenly poised to shift in either direction. Not every point may suffice to blow the issue wide open—to shift belief from 70% to 30% probability—but if your current belief is 70%, you should be as ready to drop it to 69% as raise it to 71%. You should not think that you know which direction it will go in (on average), because by the laws of probability theory, if you know your destination, you are already there. If you can investigate honestly, so that each new point really does have equal potential to shift belief upward or downward, this may help to keep you interested or even curious about the microprocess of inquiry.
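The conservation law above can be checked numerically. The sketch below uses made-up likelihoods (the 0.8 and 0.5 are illustrative assumptions, not from the text) to show that the probability-weighted average of your possible posteriors always equals your prior—so on average, you cannot expect the evidence to move you in any particular direction:

```python
# Conservation of Expected Evidence: the expected posterior equals the prior.
# All specific numbers are illustrative assumptions.

prior = 0.70                 # current belief P(H), as in the 70% example
p_e_given_h = 0.8            # assumed likelihood P(evidence | H)
p_e_given_not_h = 0.5        # assumed likelihood P(evidence | not H)

# Total probability of observing the evidence (law of total probability)
p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)

# Posterior if the evidence is seen, and if it is not (Bayes' theorem)
post_if_e = p_e_given_h * prior / p_e
post_if_not_e = (1 - p_e_given_h) * prior / (1 - p_e)

# Average the two posteriors, weighted by how likely each observation is
expected_posterior = p_e * post_if_e + (1 - p_e) * post_if_not_e

print(round(expected_posterior, 10))  # prints 0.7, exactly the prior
```

Seeing the evidence would raise the belief (to about 0.79 here) and its absence would lower it (to about 0.48), but the weighted average cancels out to the starting 0.70—whatever likelihoods you plug in.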
If the argument you are considering is not new, then why is your attention going here? Is this where you would look if you were genuinely curious? Are you subconsciously criticizing your belief at its strong points, rather than its weak points? Are you rehearsing the evidence?
If you can manage not to rehearse already known support, and you can manage to drop down your belief by one tiny bite at a time from the new evidence, you may even be able to relinquish the belief entirely—to realize from which quarter the winds of evidence are blowing against you.
Another restorative for curiosity is what I have taken to calling the Litany of Tarski, which is really a meta-litany that specializes for each instance (this is only appropriate). For example, if I am tensely wondering whether a locked box contains a diamond, then rather than thinking about all the wonderful consequences if the box does contain a diamond, I can repeat the Litany of Tarski:
If the box contains a diamond,
I desire to believe that the box contains a diamond;
If the box does not contain a diamond,
I desire to believe that the box does not contain a diamond;
Let me not become attached to beliefs I may not want.
Then you should meditate upon the possibility that there is no diamond, and the subsequent advantage that will come to you if you believe there is no diamond, and the subsequent disadvantage if you believe there is a diamond. See also the Litany of Gendlin.
If you can find within yourself the slightest shred of true uncertainty, then guard it like a forester nursing a campfire. If you can make it blaze up into a flame of curiosity, it will make you light and eager, and give purpose to your questioning and direction to your skills.
1Ursula K. Le Guin, The Farthest Shore (Saga Press, 2001).