Reply to: The Mystery of the Haunted Rationalist
Followup to: Don't Believe You'll Self-Deceive
Should a rationalist ever find themselves trying hard to believe something?
You may be tempted to answer "No", because "trying to believe" sounds so stereotypical of Dark Side Epistemology. You may be tempted to reply, "Surely, if you have to try hard to believe something, it isn't worth believing."
But Yvain tells us that - even though he knows damn well, on one level, that spirits and other supernatural things are not to be found in the causal closure we name "reality" - and even though he'd bet $100 against $10,000 that an examination would find no spirits in a haunted house - he's pretty sure he's still scared of haunted houses.
Maybe it's okay for Yvain to try a little harder to accept that there are no ghosts, since he already knows that there are no ghosts?
In my very early childhood I was lucky enough to read a book from the children's section of a branch library, called "The Mystery of Something Hill" or something, in which one of the characters says, roughly: "There are two ways to believe in ghosts. One way is to fully believe in ghosts, to look for them and talk about them. But the other way is to half-believe - to make fun of the idea of ghosts, and talk scornfully of ghosts; but to break into a cold sweat when you hear a bump in the night, or be afraid to enter a graveyard."
I wish I remembered the book's name, or the exact quote, because this was one of those statements that sinks in during childhood and remains a part of you for the rest of your life. But all I remember was that the solution to the mystery had to do with hoofbeats echoing from a nearby road.
So whenever I found something that I knew I shouldn't believe, I also tried to avoid half-believing; and I soon noticed that this was the harder part of the problem. In my childhood, I cured myself of the fear of the dark by thinking: If I'm going to think magically anyway, then I'll pretend that all the black and shapeless, dark and shadowy things are my friends. Not quite the way I would do it nowadays, but it worked to counteract the half-belief, and in not much time I wasn't thinking about it at all.
Considerably later in my life, I realized that I was having a problem with half-believing in magical thinking - that I would sometimes try to avoid visualizing unpleasant things, from half-fear that they would happen. If, before walking through a door, I visualized that a maniac had chosen that exact moment to sneak into the room on the other side, and was armed and waiting with a knife - then I would be that little bit more scared, and look around more nervously, when entering the room.
So - being, at this point, a bit more sophisticated - I visualized a spread of probable worlds, in which - in some tiny fraction - a knife-wielding maniac had indeed chosen that moment to lurk behind the door; and I visualized the fact that my visualizing the knife-wielding maniac did not make him the tiniest bit more likely to be there - did not increase the total number of maniacs across the worlds. And that did cure me, and it was done; along with a good deal of other half-superstitions of the same pattern, like not thinking too loudly about other people in case they heard me.
Enforcing reflectivity - making ourselves accept what we already know - is, in general, an ongoing challenge for rationalists. I cite the example above because it's a very direct illustration of the genre: I actually went so far as to visualize the (non-)correlation of map to territory across possible worlds, in order to get my object-level map to realize that the maniac really really wasn't there.
It wouldn't be unusual for a rationalist to find themselves struggling to let go of an unwanted belief. If we can get out of sync in that direction, why not the other direction? If it's okay to make ourselves try to disbelieve, why not make ourselves try to believe?
Well, because it really is one of the classic warning signs of Dark Side Epistemology that you have to struggle to make yourself believe something.
So let us then delimit, and draw sharp boundaries around the particular and rationalist version of striving for acceptance, as follows:
First, you should only find yourself doing this when you find yourself thinking, "Wait a minute, that really is actually true - why can't I get my mind to accept that?" Not Gloriously and Everlastingly True, mind you, but plain old mundanely true. This will be gameable in verbal arguments between people - "Wait, but I do believe it's really actually true!" - but if you're honestly trying, you should be able to tell the difference internally. If you can't find that feeling of frustration at your own inability to accept the obvious, then you should back up and ask whether or not it really is obvious, before trying to make your mind do anything. Can the fool say, "But I do think it's completely true and obvious" about random silly beliefs? Yes, they can. But as for you, just don't do that. This is to be understood as a technique for not shooting off your own foot, not as a way of proving anything to anyone.
Second, I call it "striving to accept", not "striving to believe", following billswift's suggestion. Why? Consider the difference between "I believe people are nicer than they are" and "I accept people are nicer than they are". You shouldn't be trying to raise desperate enthusiasm for a belief - if it doesn't seem like a plain old reality that you need to accept, then you're using the wrong technique.
Third and I think most importantly - you should always be striving to accept some particular argument that you feel isn't sinking in. Strive to accept "X implies Y", not just "Y". Strive to accept that there are no ghosts because spirits are only made of material neurons, or because the supernatural is incoherent. Strive to accept that there's no maniac behind the door because your thoughts don't change reality. Strive to accept that you won't win the lottery because you could make one distinct statement every second for a year with every one of them wrong, and not be so wrong as you would be by saying "I will win the lottery."
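The lottery argument above is just arithmetic, and can be checked directly. A minimal sketch, assuming Powerball-like jackpot odds of roughly 1 in 292 million (the post doesn't name a particular lottery, so that figure is an assumption):

```python
# Checking the "one wrong statement per second for a year" comparison.
# The jackpot odds below are an assumption (Powerball-like); the post
# does not specify a lottery.
seconds_per_year = 365 * 24 * 60 * 60   # distinct statements you could make
jackpot_odds = 292_201_338              # assumed 1-in-N chance of winning

print(seconds_per_year)                  # 31536000 (~31.5 million)
# Even ~31.5 million wrong statements fall short: the chance of winning
# is still about nine times smaller than 1 in 31.5 million.
print(round(jackpot_odds / seconds_per_year, 2))
```

So under these assumed odds, saying "I will win the lottery" is a claim more improbable than any single one of the ~31.5 million distinct statements you could utter over the year.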
So there is my attempt to draw a line between the Dark Side and the Light Side versions of "trying to believe". Of course the Light Side also tends to be aimed a bit more heavily at accepting negative beliefs than positive beliefs, as is seen from the three examples. Trying to think of a positive belief-to-accept was difficult; the best I could come up with offhand was, "Strive to accept that your personal identity is preserved by disassembly and reassembly, because deep down, there just aren't any billiard balls down there." And even that, I suspect, is more a negative belief, that identity is not disrupted.
But to summarize - there should always be some particular argument, that has the feeling of being plain old actually true, and that you are only trying to accept, and are frustrated at your own trouble in accepting. Not a belief that you feel obligated to believe in more strongly and enthusiastically, apart from any particular argument.
My personal experience is that self-talk is only useful insofar as you're using it to lead yourself to a sensory experience of some kind. For example, asking "What if [desired state of affairs] were true?" is far more useful than simply asserting it so. The former at least invites one to imagine something specific.
Repetition also isn't as useful as most people seem to think. Your brain has little problem updating information immediately, if there's sufficient emotion involved... and the "aha" of insight (i.e. reducing the modeling complexity required to explain your observations) counts as an emotion. If you have to repeat it over and over again -- and it's not a skill you're practicing -- you're doing something wrong.
All of these terms -- self-talk, visualization, and pretending -- are also examples of Unteachable Excellence and Guessing The Teacher's Password. You can use any of them equally well to describe something useful (like asking good questions) or something ridiculous (like affirmations). The specific way in which you talk, visualize, or pretend is of critical importance.
For example, if you simply visualize some scripted scenario, rather than engaging in inquiry with yourself, you are wasting your time. The "near" brain needs to generate the details, not the "far" brain, or else you don't get the right memories in context.
I'll admit to a bit of hand-waving on that last part -- I know that when my clients visualize, self-talk, or pretend in "scripted" ways (driven by conscious, logical, and "far" thinking), my tests show no change in belief or behavior, and that when they simply ask what-if questions and observe their mind's response, the tests show changes. My guess is that this has something to do with the "reconsolidation" theory of memory: that activating a memory is required in order to change it. But I'm more of a pragmatist than a theorist in this area.
My experience with self-talk is via cognitive behavioral therapy, as described in "Feeling Good". There are a lot of concrete and specific ways of adjusting one's emotions to match one's deliberative beliefs in that book.
"You can equally use the term to describe something useful or something ridiculous." I agree completely. I think the success of religious memes has a lot to do with their systematic advocacy of self-talk, visualization and imitation of guru figures.