I increasingly think that for rationalists to use religion as their intellectual sparring partner is a form of laziness. I could barely be more confident than I am that religion is a load of bunkum, and so when I engage with it I know I am vastly better armed and virtually guaranteed victory, at least in my own eyes. This makes it very attractive to spar with as far as my ego is concerned, with almost zero chance of discovering my own mistakes or the flaws in my own rationality. Religion really could not do more to flag itself as mistaken from a rational point of view - we should move on and start taking on harder targets.
I agree that religion is an easy target, and not much of a test for rationality.
However, I would be wary about the choice of harder targets, because some of them (politics especially) are things that a lot of people feel strongly about, which could make things messy here.
I think politics is a subject where everybody should try to overcome his own biases, but I would also prefer if OvercomingBias and LessWrong stayed politically neutral.
I think this argument fails when you look at it on Bayesian grounds.
If we want to use the large number of intelligent religious people as evidence for the truth of religion, we need to show that P(intelligent believers|religion true) is greater than P(intelligent believers|religion false); that intelligent people are more likely to believe religion if it's true than if it's false.
But at most one religion can be true. Therefore, all other religions are false. But lots of people, many of them intelligent, believe (for example) Christianity, and lots of others believe Hinduism. If (for the sake of argument) we're wondering whether Christianity is true, then we cannot explain all the smart Hindus without admitting that people are just as likely to believe a false religion as a true one.
But if people are just as likely to believe a false religion as a true one, then lots of people believing a religion is no evidence that it is true.
There is an argument for shifting views towards general religious feelings, since that can't be disproven so easily. And I do shift my views a little in that direction. But remember that you can't double-shift. You have to shift them from the place they would be if no one believed in religion at all. That is, imagine a world where everyone was a scientific materialist, and imagine the credence you would give to this new hypothesis someone just dreamt up that the world was created by supernatural beings six thousand years ago. Then you can multiply that credence by whatever factor you want to use for the high level of belief people have in it. But the original credence is so vanishingly low that even the extra believers can't save it.
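To make that arithmetic explicit, here is a minimal Python sketch with made-up numbers (none of these probabilities come from the comments above): if intelligent believers are about as common under a false religion as under a true one, the likelihood ratio is 1 and the posterior equals the prior, and even a generous factor of 10 cannot rescue a vanishingly small prior.

```python
# Illustrative Bayes arithmetic with invented numbers, not measured data.

def posterior_odds(prior_odds, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    return prior_odds * likelihood_ratio

# Point 1: smart Hindus and smart Christians suggest intelligent believers are
# roughly as likely whether a given religion is true or false, so the ratio is ~1.
p_believers_given_true = 0.9   # hypothetical
p_believers_given_false = 0.9  # hypothetical
lr = p_believers_given_true / p_believers_given_false
print(posterior_odds(prior_odds=1e-6, likelihood_ratio=lr))  # 1e-06: no update at all

# Point 2: even granting widespread belief a generous factor of 10,
# a vanishingly small prior stays vanishingly small.
print(posterior_odds(prior_odds=1e-6, likelihood_ratio=10))  # roughly 1e-05
```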
Also, re: Einstein and God. Richard Dawkins answers this better than I do. I suggest you read his work on the subject. The summary is that Einstein liked to use the word "God" as a metaphor for "physics", but wasn't a believer per se. Newton was, but he was also a believer in alchemy...
But I'd still like to thank you for bringing up this topic. You're using rationalist methods to support religion, which is exactly how all religious people should be doing it and which is something rationalists should take seriously. I'm a bit sad you're getting downvoted as much as you are, compared to some inane comments about how stupid all religious people must be that tend to be voted up. I predict most atheists will be overconfident in their rejection of religion, simply because most people are overconfident in any politically charged topic they feel strongly about. It's good to occasionally have to listen to intelligent rationalist arguments in favor of religion to avoid an echo chamber effect.
“You're using rationalist methods to support religion”
Thank you very much for the compliment. However, it is totally undeserved. Being essentially an atheist (well, agnostic to be precise), supporting religion was the last thing on my mind. What I really wanted to do is to test how rational and intelligent people, who I hoped would be overrepresented on this forum, would react to arguments that go against their preferred view.
It is interesting that everyone seems to assume that I am a religious person myself, though I thought the contrary should be pretty obvious from the post title. Personally, I have yet to meet people who would call their beliefs “irrational”.
I call plenty of my beliefs irrational... at least during the window between where I realize I have them, find the basis for them, and then fix them. ;-)
"It is interesting that everyone seems to assume that I am a religious person myself, though I thought the contrary should be pretty obvious from the post title."
Guilty as charged. http://en.wikipedia.org/wiki/Fundamental_attribution_error
It is interesting that everyone seems to assume that I am a religious person myself,
I didn't. I assumed you were making a valuable point but using a poor argument while doing so.
"But at most one religion can be true. Therefore, all other religions are false."
This is the single biggest reason why I can't understand how anyone can believe any specific religion. Dawkins's classic response when asked "What if you're wrong?" is simply to repeat the question.
The question religious people should ask themselves is this: of all of the thousands of religions there have been in human history, what is the probability that you just happened to be born in a family who just happened to believe in what just happens to be the correct interpretation of what just happens to be the specific version of what just happens to be the only holy book which gets you into heaven?
This point is strengthened by the fact that almost all religions are very clear that they are an exclusive club - if you pick the wrong religion (or even the wrong denomination within a religion) you don't get Eternal Life.
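A back-of-the-envelope version of that question, with purely illustrative counts (the numbers of religions, denominations, and interpretations below are invented for the sake of the multiplication, not taken from any survey):

```python
# Purely illustrative counts; the point is the multiplication, not the exact figures.
religions_in_history = 3000           # hypothetical number of distinct religions
denominations_per_religion = 20       # hypothetical number of denominations each
interpretations_per_denomination = 5  # hypothetical readings of each holy book

# Chance that an accident of birth lands you in exactly the "right" combination,
# assuming where you are born is unrelated to which combination (if any) is correct:
p_lucky_birth = 1 / (religions_in_history
                     * denominations_per_religion
                     * interpretations_per_denomination)
print(p_lucky_birth)  # ~3.3e-06
```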
This point is strengthened by the fact that almost all religions are very clear that they are an exclusive club - if you pick the wrong religion (or even the wrong denomination within a religion) you don't get Eternal Life.
This is not universally true of any of the major contemporary religions (as practiced, and according to the scripture of at least some), and probably not true of most of the thousands of religions there have been. In fact, I believe many tribal pagan religions don't claim the nonexistence of their neighbors' deities - they're just "not our way," so to speak.
Also, nitpickingly, not every religious person has retained the faith they were raised with.
Of course, the general point is good.
So what should an expert rationalist do to avoid this overconfidence trap? The seeming answer is that we should rely less on our own reasoning and more on the “wisdom of the crowds”.
I have another answer. If my beliefs differ from the majority, then I would take the crowd's beliefs as potential counter-evidence. Rather than leading me to conform to the views of the crowd, it will lead me to scrutinize my beliefs extra hard for potential errors in reasoning. If I find some, then I might agree with the crowd. Otherwise, I will assume that the crowd is wrong.
It also depends on the level of speculation involved and how good the quality of evidence is. For dealing with a speculative question where I don't have much evidence, the beliefs of the crowd might be some of the best quality evidence I have, so I will adjust my assessment in their direction. But if I have some better evidence that I know the crowd doesn't have, if I can point out massive gaping holes in the crowd's reasoning, or if I can see cognitive biases that would better explain the crowd's views instead of those views being caused by truth, then the evidence of the crowd would not be a large factor in my thinking.
For instance, multiple studies have shown that investors who are more confident of their ability to beat the market receive lower returns on their investments. This overconfidence penalty applies even to the supposed experts, such as fund managers.
The ability of experts vs. the crowd varies in different domains. Yes, there are domains where the experts are overconfident, or even charlatans (finance is one of the best examples). Yet it's also easy to find domains where the experts do know a lot better than the crowd. I really don't trust what the crowd thinks about quantum mechanics or brain surgery.
Remember, IQ is normally distributed. Only a couple percent of people have an IQ two or more standard deviations above the mean (130 or higher). For certain types of problems that require high intelligence to solve, the views of the masses will indeed be worthless on average. Same thing with problems that require certain types of specialized knowledge.
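For the record, the "couple percent" figure falls straight out of the normal curve; here is a quick check, assuming the usual IQ convention of mean 100 and standard deviation 15:

```python
from scipy.stats import norm

# Fraction of a normally distributed population at least 2 standard deviations
# above the mean (IQ >= 130 under the usual mean-100, SD-15 convention).
print(norm.sf(2))                       # ~0.0228, i.e. about 2.3%
print(norm.sf(130, loc=100, scale=15))  # same calculation expressed in IQ units
```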
In many cases, it really is possible that the crowd is just being stupid, again. To ignore this, and to fail to acknowledge that it can be true even in cases where you arrogantly disagree with the crowd, is not rationality but a weird form of cognitive hyper-humility. The crowd isn't necessarily wrong because we are smarter than it, but I think it's possible to make rational arguments that the crowd is wrong and we are right.
Consider that many important philosophical and scientific ideas that the crowd acknowledges as true today (e.g. heliocentrism) were once pioneered by an expert going against the crowd. I think history would have gone worse if guys like Galileo had (sincerely) said, "hey guys, I kinda thought that the Earth went around the Sun, but since so many of you believe otherwise, I think I might be a bit too overconfident..."
Smart people going against the wisdom of the crowd is a necessary, but not sufficient, condition for the advancement of human knowledge.
Would they be willing to shift their views to accommodate the chance that their own reasoning powers are insufficient to get the right answer?
No. I'm confident that there are no rational reasons to believe in God, and I've scrutinized my reasoning plenty, and the arguments for belief in God. I have much better explanations for the crowd's beliefs rather than God existing and causing those beliefs. Although I would like to hear Newton or Einstein's justifications for theism. I'm a fallibilist, so I recognize that I could be wrong.
I agree with you that there is a danger in over-confidence, including for the smart people, but I think there is also a danger in excessive cognitive humility, especially for smart people. To paraphrase a famous quote: All that is necessary for the triumph of stupidity is that smart people do nothing.
Thanks for the thoughtful reply. I must admit that even though I expected that some “rationalists” would be just as defensive as religious folks about their views, looking at my karma now I realize that I grossly underestimated their number. That’s a good lesson to me for lecturing other people about overconfidence.
" For certain types of problems that require high intelligence to solve, the views of the masses will indeed be worthless on average. "
I generally share your opinion when the issues at stake are devoid of emotional charge. However, once we move to our deeply cherished views, high-IQ people can be just as good at self-deception as anybody else, if not better. As Orwell said, "some ideas are so stupid that only intellectuals could believe them."
"So what an expert rationalist should do to avoid this overconfidence trap? The seeming answer is that we should rely less on our own reasoning and more on the “wisdom of the crowds."
As Bryan Caplan's "Myth of the Rational Voter" pretty convincingly shows, the wisdom of crowds is of little use when the costs of irrationality are low. It's true in democracy: voting for an irrational policy like tariffs has almost no cost, because a single vote almost never matters. The benefit of feeling good about voting for what you like to believe in is big, though.
Similarly, in religious matters, the costs to the individual are usually slight compared to the benefits: the cost of, say, weekly church attendance is small, while the attendance itself provides group bonding and social connections. [There are certainly places, and there were times, when costs were vastly higher - daily attendance, alms tax, etc. But the benefits were proportionately bigger, as your group would be key to defending your life.]
In either case, trusting the wisdom of crowds seems to be a dangerous idea: if the crowd is systematically biased, you're screwed.
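To put rough numbers on that cost asymmetry (all of them invented for illustration, not Caplan's figures): the expected personal cost of an irrational vote is the harm of the policy multiplied by the tiny chance that your single vote is decisive, and that product is easily swamped by the warm glow of voting expressively.

```python
# Invented illustrative numbers; only the orders of magnitude matter here.
p_decisive_vote = 1e-7            # assumed chance one vote swings a large election
personal_harm_of_policy = 5000    # assumed dollar harm to you if the bad policy passes
feel_good_benefit = 20            # assumed dollar-equivalent warm glow of the expressive vote

expected_cost_of_irrational_vote = p_decisive_vote * personal_harm_of_policy
print(expected_cost_of_irrational_vote)                      # roughly 0.0005 dollars
print(feel_good_benefit > expected_cost_of_irrational_vote)  # True: irrationality is cheap
```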
The "Wisdom of the Crowds" that is embedded in the markets comes not from an aggregation of the expressed beliefs of individual investors. The wisdom can be extracted from the behavior of the individuals, behaviors that they do not always understand and often do not have the ability or inclination to express.
When it comes to human religious beliefs I similarly look at behavior as the strongest indicator. What I see, and what the research into human behavior seems to suggest, is that the 'collective wisdom' decision would be to have a superficial religious belief that you maintain for social purposes, while minimising the effect that such beliefs have on behavior in the world.
When it comes to people that have some level of expertise in the subject, I am comfortable that I have accommodated their positions at least passably. As for allowing the majority's uninformed belief to sway me, I was raised a believer and remained devout into my adult years. That is altogether too much accounting, to my mind.
Or that this belief is not restricted to the uneducated but shared by many famous scientists, including Newton and Einstein?
Fame. Experts completely out of their field claimed as banners. Insulting.
Einstein was not a theist. From a private letter he wrote that was auctioned off last year:
the word God is for me nothing more than the expression and product of human weaknesses, the Bible a collection of honorable but still primitive legends which are nevertheless pretty childish.
He was spiritual in the sense that many scientists are spiritual -- Sagan-like awe at the wonders of the universe -- but that's a far cry from believing in God.
Not that it makes much difference whether 1 scientist did or did not believe in God, but it's unfortunate that this particular falsehood never seems to die.
"The seeming answer is that we should rely less on our own reasoning and more on the “wisdom of the crowds”."
No, no. God, no. Such a strategy only works if enough people aren't following it, but are instead pursuing a genuinely effective course. If a minimum threshold is crossed, people end up following other followers, without anyone actually leading. The phrase "the blind leading the blind" was coined for a reason.
If too many people try to piggyback on others, no one gets anywhere. Somebody has to do the thinking. It might as well be us.
Overconfidence can be defeated by limiting our confidence to what we can actually demonstrate, and being doubtful and uncertain about anything we can't.
One of the features of relying upon the wisdom of the crowds is that you don't ask the same questions that the members of the 'crowd' are asking. So it's not really 'the blind leading the blind'.
For example (an oversimplified econ 101 sort of example), if you don't care what kind of wood your pencils are made out of, you can just buy the cheapest wood. The price of the wood is based on the supply of different kinds of wood, what other people are making out of wood, and demand in the marketplace for different kinds of manufactured wood products (amongst other things). If you were making pencils out of cedar, and then everyone decides they want bookcases made of cedar, then the price of cedar will rise and you can start making them out of some other wood. The result is that you end up using wood in the most efficient fashion, taking into consideration the desires of people buying all sorts of things and the need for wood in different areas of society. But this is all accomplished without anyone being concerned about using wood most efficiently; the pencil-maker just wants to make pencils as cheaply as possible, the people buying pencils encourage him to do so by buying the cheapest pencils that meet their needs, and the other industries have no need to concern themselves over pencils at all.
Wish I had more time to make this clearer.
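Here is a toy version of that pencil example in code, with invented prices; each producer only compares prices, yet the scarce wood ends up flowing toward the use that values it most:

```python
# Toy illustration of the cedar/pencil story; all prices are invented.
wood_prices = {"cedar": 10, "basswood": 12, "pine": 14}  # price per unit, before the bookcase fad

def cheapest_wood(prices):
    """The pencil-maker's entire decision rule: pick the cheapest acceptable wood."""
    return min(prices, key=prices.get)

print(cheapest_wood(wood_prices))  # 'cedar'

# Bookcase buyers suddenly want cedar, and their demand bids its price up.
wood_prices["cedar"] = 25

# Without knowing or caring why the price moved, the pencil-maker switches woods,
# leaving the now-scarce cedar to the buyers who value it more.
print(cheapest_wood(wood_prices))  # 'basswood'
```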
Markets!
The pencils made out of more expensive wood will sell better if that's what people want; otherwise, they'll buy the ones made out of cheaper wood because they're cheaper.
If you try to sell the cheaper pencils at the more expensive pencils' price, you lose the information about what people want and meanwhile someone undercuts you and eats your lunch.
I think you have a good point regarding religion, and I'm currently looking for a miracle in order to falsify my current atheist mindset. So far the Christians I asked were unwilling to provide convincing evidence. But I'm still looking.
On a personal note, I was a Christian for over 10 years and was a member of lots of churches, and I've never seen a miracle that could not be explained by science. Overcomingbias.com has helped me to see the truth and become an atheist.
You should only look for miracles if you expect to find them. And a naturalistic understanding of the world insists that you shouldn't expect to.
This view, taken to its logical extreme, would invalidate science. You're supposed to put some effort into looking for things that could disprove your current worldview.
You may well have an irrational tendency not to expect things that turn out to be real. There is an unfathomable number of things that you don't expect to find, so choosing your next dragon to hunt needs to be based on some kind of evidence, like a known bias.
P.S. Nick's interpretation is correct: in "expect" I included very unlikely but still barely plausible claims.
Why assume Roland assigns epsilon probability to miracles? If he's sufficiently uncertain about atheism so as to want to look for miracles, I'd say going out and looking is the best possible thing he can be doing. It's too much to expect that he go from being a Christian to having epsilon probability of God right away. If a few years after being a church-going Christian he's now at 5% probability of God, I wouldn't say he's doing anything wrong. Even Jeffreyssai doesn't say every decision should be made in less than a second. Just don't take thirty years.
It could be argued that because of the importance of miracles (a single one would prove some form of religion, religion is a very important issue) it's worth keeping an eye out for them even at such low levels of expectation that you'd give up looking for, say, the Higgs boson.
If thirty years from now he's still looking for miracles, that would be a problem, but the thought-mode that one should look for evidence if one is uncertain wouldn't be the issue. The issue would be whatever it was that was keeping his probability distribution at 5%.
If a few years after being a church-going Christian he's now at 5% probability of God, I wouldn't say he's doing anything wrong.
I disagree. Changes of opinion about conclusions should be swift and decisive, which doesn't mean that in the same movement you should wipe out from your mind the understanding and experience gained from the previous, invalidated position. Changing your mind swiftly, while keeping the background that allows you to regain mastery of the disbelieved position in case it returns to plausibility, seems to bring the best of both practices. This is the attitude I have towards Robin Hanson's modesty argument: you should change your conclusion towards the universally accepted one, but still be ready to change it back in an informed manner.
The costs of acting on an overly inert and therefore wrong conclusion bear on everything you do, while the cost of still thinking about and remembering the background that led to the very unlikely conclusion is low enough to justify keeping it around. At one point you stop actively researching your unlikely idea, at the next point you stop thinking about it, and some clicks further you forget it entirely. Just don't be overconfident about the conclusion, thinking you know how overwhelmingly, 10-to-the-minus-1000 unlikely it is, when in fact it turns out to be correct.
Good point. The thing is, though, that a lot of Christians I know claim they have witnessed miracles. So there might be something to it.
"So what an expert rationalist should do to avoid this overconfidence trap?"
You mean, how should one overcome bias? Be less wrong, if you will? You've come to the right place. David Balan had a post that didn't receive enough attention over at OB: http://www.overcomingbias.com/2008/09/correcting-bias.html This comment roughly paraphrases the points I made there.
If we can identify a bias, presumably we can also identify the optimal outcome that would happen in the absence of such bias. There are two ways to achieve this, and I will post them in separate comments so they can be voted down separately.
If you can identify an optimal outcome for a situation in which you are likely to be biased, you can constrain yourself ahead of time such that you can't give in to the bias. The classical example is Odysseus tying himself to the mast to avoid giving in to the sirens' song. Tie yourself to the mast at a rational moment, so you don't err in a biased one.
Applying this to your example: if you are indeed trying to maximize returns on a portfolio, the last place you should make buy/sell decisions is on a loud, testosterone-laden trading floor. It's better to decide ahead of time on a model by which you are going to manage: "If a tech stock has x earnings but y insider ownership, then buy." Stick to your rules [perhaps bind yourself through some automatic limit that you cannot circumvent] as long as they seem to be achieving your goal [maximize return]. Revisit them if they don't seem to be - but again, revisit them at a time when you are not likely to be biased.
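As a minimal sketch of that kind of precommitment (the thresholds below are hypothetical placeholders for whatever x and y you settled on at your calm, rational moment):

```python
# Hypothetical rule fixed in advance, at a calm moment; the thresholds are examples only.
MAX_PRICE_TO_EARNINGS = 20    # e.g. refuse to pay more than 20x earnings
MIN_INSIDER_OWNERSHIP = 0.05  # e.g. require at least 5% insider ownership

def decided_in_advance(price_to_earnings, insider_ownership):
    """Mechanical buy/pass rule, so the heat of the trading floor never gets a vote."""
    if price_to_earnings <= MAX_PRICE_TO_EARNINGS and insider_ownership >= MIN_INSIDER_OWNERSHIP:
        return "buy"
    return "pass"

print(decided_in_advance(price_to_earnings=15, insider_ownership=0.08))  # 'buy'
print(decided_in_advance(price_to_earnings=35, insider_ownership=0.08))  # 'pass'
```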
Agreed.
The classical example is Odysseus tying himself to the mast to avoid giving in to the sirens' song
You just stole that from the banner at OB
That people are overconfident doesn't imply that they are more overconfident when they go against the majority opinion. Investors who are too confident that they can beat the market receive lower returns, but the market is probably smarter than the majority opinion.
I would even say that the pressure to conform is so strong that most people will be much more overconfident when they go with the wisdom of the crowd. Could it be that, as participants in a blog dedicated to rationality, we are more overconfident and resist the common opinion too much? It could, but I'd like to see some evidence.
Nonreligious people do need to adjust for the large number of intelligent religious people. But consider that the adjustment is made to the prior plausibility of religion before you know that billions of people believe it.
Imagine living in an alternate world where everyone was an atheist. An archaeologist digs up a somewhat contradictory manuscript with a few historical errors that ends with a description of a man coming back from the dead and ascending into the sky. As far as the archaeologist can tell, no one ever believed this manuscript and it's not even clear the writer believed it himself. In this world, I wouldn't give it a second thought. I wouldn't even give it a first thought.
In our world, the millions of people who say this story has changed their lives and that they have witnessed miracles and so on provide evidence for this manuscript. It's not as much evidence as we might naively think, because we know that people are likely to have mystical experiences and believe they have witnessed miracles even with no reason (for example, everyone agrees Greek paganism is false, but Greek pagans still record miracles and mystical experiences). However, I am willing to admit that the large degree of belief in religion makes religion an order of magnitude more likely than it would be if there weren't any such belief (I wouldn't invoke the wisdom of crowds here, but I might invoke some attenuated form of Aumann).
But consider how vanishingly small the plausibility of religion was in the world where no one believed it, and even an order of magnitude isn't enough to save it.
But I would like to thank you for bringing up this topic. Looking at the Sanity Waterline thread, some people seem much too willing to suspend their usual good sense in the service of bashing religion just a little bit more. And I think it's a general rule that it's always healthier to show people arguments for things they disagree with (whether they're good or bad) than to keep them watching a constant parade of things confirming their previous beliefs.
part 2: "So what should an expert rationalist do to avoid this overconfidence trap?"
Apologies for flooding the comments, but I wanted to separate the ideas so they can be discussed separately. The question is how to avoid overconfidence, and bias in general. Picking up from last time:
If we can identify a bias, presumably we can also identify the optimal outcome that would happen in the absence of such bias. If we can do that, can't we also constrain ourselves in such a way that we can achieve the optimal outcome despite giving in to the bias? For example, David Balan referenced his own softball game, in which he swings a half-second too early and has been unable to tell himself "swing .5 seconds later" with any success. My advice to him was to change his batting stance such that the biased swing still produces the optimal outcome.
This idea of "changing your stance" is especially useful in situations in which you can't constrain yourself in other ways: in situations in which you know you will be biased and can't avoid making decisions in such situations. David would have to avoid the game altogether to correct his bias, but that's akin to saying that the dead don't commit bias: by adjusting his stance he can stay in the game AND have the right outcome.
In contrast to constraining your possible set of actions to unbiased ones [as I suggested in my other comment] the other possible way to deal with it is to set your starting point [your "stance"] such that the biased action/decision gets you to the right place.
“Everyone complains of his memory, but nobody of his judgment.” This maxim of La Rochefoucauld rings as true today as it did back in the seventeenth century. People tend to overestimate their reasoning abilities even when this overconfidence has a direct monetary cost. For instance, multiple studies have shown that investors who are more confident of their ability to beat the market receive lower returns on their investments. This overconfidence penalty applies even to the supposed experts, such as fund managers.
So what should an expert rationalist do to avoid this overconfidence trap? The seeming answer is that we should rely less on our own reasoning and more on the “wisdom of the crowds”. To a certain extent this is already achieved by societal pressure to conform, which acts as an internal policeman in our minds. Yet those of us who deem ourselves not very susceptible to such pressures (overconfidence, here we go again) might need to shift our views even further.
I invite you now to experiment with how this will work in practice. Quite a few of the recent posts and comments spoke with derision about religion and supernatural phenomena in general. Did the authors of these comments fully consider the fact that the existence of God is firmly believed by the majority? Or that this belief is not restricted to the uneducated but shared by many famous scientists, including Newton and Einstein? Would they be willing to shift their views to accommodate the chance that their own reasoning powers are insufficient to get the right answer?
Let the stone throwing begin.