Later I asked him about the efficacy of prayer and he said it worked as long as you weren't doing a test to see if it worked. How convenient.
This is the best example I have seen yet, but I am still not convinced that the problem is with anticipations not being guided by beliefs. He still anticipates something but is willing to amend the wrong side of the experiment when something goes weird.
But yeah, this is a much clearer example. I can think of a bunch of people I know who act like this.
The rest of this comment is nitpicking over something only slightly related.
I say, "So if there was a group of people with some disease, we should expect those who were prayed for to be more likely to get better, right?"
This sentence will trigger the conditioning I was talking about. This is the exact wrong way to talk to someone about the subject.
Either they say, "No" because they know about the studies that have been done, or they say, "Yes," I mention the studies, and they say something about how you can't put God to the test.
No it is not [different from the dragon example]. Their reaction is more emotionally charged than in the dragon example. The theists have a belief but anticipations guided by not-belief.
Those who say "No" because they know about the studies are not like the dragon example. They would have to say no before they knew about the studies. And "studies" here includes every single failed prayer from their own life.
If you found someone who had absolutely no good reason to doubt prayer, they would expect the studies to show prayer works. A pre-dodge of the experiment is much more likely to point to previous encounters with experiments than to anticipations hooked up to not-beliefs.
Those who say "Yes" are now amending their belief to fit the facts. This is not like the dragon example.
Another example: One of my friends is studying to be a Catholic priest. He believes in evolution. Of course I couldn't help but ask him if he thought (non-human) animals went to heaven. He said no. "Ah-ha!" I thought, "The trap is set!"
Stop trying to trap people. It is petty, rude, and just makes the world worse. Most people, even theists, are willing to talk about their beliefs if they don't feel defensive. People can smell a trap coming as soon as they see someone's face. As soon as they get defensive, the conversation becomes a war. This is bad.
Me: "So there had to be some point in evolution where two hairy proto-humans gave birth to a slightly less hairy human. Even though they only differed from each other as much as we differ from our parents, the proto-human parents didn't have souls and the child did. If the child went to heaven, he would ask God where his parents went."
Friend: "Yes."
Me: o_O
Really, the fact that you seem so surprised by this answer makes me think you have no idea what your friend believes. When your predictions about someone's answers to questions like this are off enough to make you go o_O, you may want to start examining your predictors.
Sigh. I am sorry for jumping at you. I don't really have a good excuse, but I am sorry anyway.
Stop trying to trap people. It is petty, rude, and just makes the world worse.
But it's fun! At least it's fun between friends. Remember that my friend got the last laugh in my trap example. We both know we're not going to convince each other, but it's still fun to play argument chess.
Just to balance things out, I'll give you an example of a trap my friend set for me.
Me: (Starts to explain transhumanism. Quotes EY saying, "Life is good, death is bad. Health is good, sickness is bad." etc)
Friend: "If life is good and death is bad, then isn...
Carl Sagan once told a parable of someone who comes to us and claims: “There is a dragon in my garage.” Fascinating! We reply that we wish to see this dragon—let us set out at once for the garage! “But wait,” the claimant says to us, “it is an invisible dragon.”
Now as Sagan points out, this doesn’t make the hypothesis unfalsifiable. Perhaps we go to the claimant’s garage, and although we see no dragon, we hear heavy breathing from no visible source; footprints mysteriously appear on the ground; and instruments show that something in the garage is consuming oxygen and breathing out carbon dioxide.
But now suppose that we say to the claimant, “Okay, we’ll visit the garage and see if we can hear heavy breathing,” and the claimant quickly says no, it’s an inaudible dragon. We propose to measure carbon dioxide in the air, and the claimant says the dragon does not breathe. We propose to toss a bag of flour into the air to see if it outlines an invisible dragon, and the claimant immediately says, “The dragon is permeable to flour.”
Carl Sagan used this parable to illustrate the classic moral that poor hypotheses need to do fast footwork to avoid falsification. But I tell this parable to make a different point: The claimant must have an accurate model of the situation somewhere in their mind, because they can anticipate, in advance, exactly which experimental results they’ll need to excuse.
Some philosophers have been much confused by such scenarios, asking, “Does the claimant really believe there’s a dragon present, or not?” As if the human brain only had enough disk space to represent one belief at a time! Real minds are more tangled than that. There are different types of belief; not all beliefs are direct anticipations. The claimant clearly does not anticipate seeing anything unusual upon opening the garage door. Otherwise they wouldn’t make advance excuses. It may also be that the claimant’s pool of propositional beliefs contains the free-floating statement There is a dragon in my garage. It may seem, to a rationalist, that these two beliefs should collide and conflict even though they are of different types. Yet it is a physical fact that you can write “The sky is green!” next to a picture of a blue sky without the paper bursting into flames.
The rationalist virtue of empiricism is supposed to prevent us from making this class of mistake. We’re supposed to constantly ask our beliefs which experiences they predict, make them pay rent in anticipation. But the dragon-claimant’s problem runs deeper, and cannot be cured with such simple advice. It’s not exactly difficult to connect belief in a dragon to anticipated experience of the garage. If you believe there’s a dragon in your garage, then you can expect to open up the door and see a dragon. If you don’t see a dragon, then that means there’s no dragon in your garage. This is pretty straightforward. You can even try it with your own garage.
No, this invisibility business is a symptom of something much worse.
Depending on how your childhood went, you may remember a time period when you first began to doubt Santa Claus’s existence, but you still believed that you were supposed to believe in Santa Claus, so you tried to deny the doubts. As Daniel Dennett observes, where it is difficult to believe a thing, it is often much easier to believe that you ought to believe it. What does it mean to believe that the Ultimate Cosmic Sky is both perfectly blue and perfectly green? The statement is confusing; it’s not even clear what it would mean to believe it—what exactly would be believed, if you believed. You can much more easily believe that it is proper, that it is good and virtuous and beneficial, to believe that the Ultimate Cosmic Sky is both perfectly blue and perfectly green. Dennett calls this “belief in belief.”1
And here things become complicated, as human minds are wont to do—I think even Dennett oversimplifies how this psychology works in practice. For one thing, if you believe in belief, you cannot admit to yourself that you merely believe in belief. What’s virtuous is to believe, not to believe in believing; and so if you only believe in belief, instead of believing, you are not virtuous. Nobody will admit to themselves, “I don’t believe the Ultimate Cosmic Sky is blue and green, but I believe I ought to believe it”—not unless they are unusually capable of acknowledging their own lack of virtue. People don’t believe in belief in belief, they just believe in belief.
(Those who find this confusing may find it helpful to study mathematical logic, which trains one to make very sharp distinctions between the proposition P, a proof of P, and a proof that P is provable. There are similarly sharp distinctions between P, wanting P, believing P, wanting to believe P, and believing that you believe P.)
There are different kinds of belief in belief. You may believe in belief explicitly; you may recite in your deliberate stream of consciousness the verbal sentence “It is virtuous to believe that the Ultimate Cosmic Sky is perfectly blue and perfectly green.” (While also believing that you believe this, unless you are unusually capable of acknowledging your own lack of virtue.) But there are also less explicit forms of belief in belief. Maybe the dragon-claimant fears the public ridicule that they imagine will result if they publicly confess they were wrong.2 Maybe the dragon-claimant flinches away from the prospect of admitting to themselves that there is no dragon, because it conflicts with their self-image as the glorious discoverer of the dragon, who saw in their garage what all others had failed to see.
If all our thoughts were deliberate verbal sentences like philosophers manipulate, the human mind would be a great deal easier for humans to understand. Fleeting mental images, unspoken flinches, desires acted upon without acknowledgement—these account for as much of ourselves as words.
While I disagree with Dennett on some details and complications, I still think that Dennett’s notion of belief in belief is the key insight necessary to understand the dragon-claimant. But we need a wider concept of belief, not limited to verbal sentences. “Belief” should include unspoken anticipation-controllers. “Belief in belief” should include unspoken cognitive-behavior-guiders. It is not psychologically realistic to say, “The dragon-claimant does not believe there is a dragon in their garage; they believe it is beneficial to believe there is a dragon in their garage.” But it is realistic to say the dragon-claimant anticipates as if there is no dragon in their garage, and makes excuses as if they believed in the belief.
You can possess an ordinary mental picture of your garage, with no dragons in it, which correctly predicts your experiences on opening the door, and never once think the verbal phrase There is no dragon in my garage. I even bet it’s happened to you—that when you open your garage door or bedroom door or whatever, and expect to see no dragons, no such verbal phrase runs through your mind.
And to flinch away from giving up your belief in the dragon—or flinch away from giving up your self-image as a person who believes in the dragon—it is not necessary to explicitly think I want to believe there’s a dragon in my garage. It is only necessary to flinch away from the prospect of admitting you don’t believe.
If someone believes in their belief in the dragon, and also believes in the dragon, the problem is much less severe. They will be willing to stick their neck out on experimental predictions, and perhaps even agree to give up the belief if the experimental prediction is wrong.3 But when someone makes up excuses in advance, it would seem to require that belief and belief in belief have become unsynchronized.
1 Daniel C. Dennett, Breaking the Spell: Religion as a Natural Phenomenon (Penguin, 2006).
2 Although, in fact, a rationalist would congratulate them, and others are more likely to ridicule the claimant if they go on claiming there's a dragon in their garage.
3 Although belief in belief can still interfere with this, if the belief itself is not absolutely confident.