Rational Defense of Irrational Beliefs
“Everyone complains of his memory, but nobody of his judgment.” This maxim of La Rochefoucauld rings as true today as it did back in the seventeenth century. People tend to overestimate their reasoning abilities even when this overconfidence has a direct monetary cost. For instance, multiple studies have shown that investors who are more confident in their ability to beat the market receive lower returns on their investments. This overconfidence penalty applies even to the supposed experts, such as fund managers.
So what should an aspiring rationalist do to avoid this overconfidence trap? The obvious answer is that we should rely less on our own reasoning and more on the “wisdom of the crowds”. To a certain extent this is already achieved by social pressure to conform, which acts as an internal policeman in our minds. Yet those of us who deem ourselves not very susceptible to such pressures (overconfidence, here we go again) might need to shift our views even further.
I now invite you to an experiment in how this would work in practice. Quite a few of the recent posts and comments spoke with derision about religion and supernatural phenomena in general. Did the authors of these comments fully consider the fact that the existence of God is firmly believed by the majority? Or that this belief is not restricted to the uneducated, but shared by many famous scientists, including Newton and Einstein? Would they be willing to shift their views to accommodate the chance that their own reasoning powers are insufficient to get the right answer?
Let the stone throwing begin.
Beginning at the Beginning
I can't help but notice that some people are using very peculiar and idiosyncratic meanings for the word 'rational' in their posts and comments. In many instances, the correctness of rationality is taken for granted; in others, the process of being rational is not only ignored, but dispensed with altogether, and rational is defined as 'that which makes you win'.
That's not a very useful definition. If I went to someone looking for help selecting between options, and was told to choose "the best one", or "the right one", or "the one that gives you the greatest chance of winning", what help would I have received? If I had clear ideas about how to determine the best, the right, or the one that would win, I wouldn't have come looking for help in the first place. The responses provide no operational assistance.
There is a definite lack of understanding here of what rationality is, much less why it is correct, and this general incomprehension can only cripple attempts to discuss its nature or how to apply it. We might think that this site would try to dispel the fog surrounding the concept. Remarkably, a blog established to help "refine the art of human rationality" neither explains nor defines rationality.
Those are absolutely critical goals if lesswrong is to accomplish what it advertises itself as attempting. So let's try to reach them.
The human mind is at the same time both extremely sophisticated and shockingly primitive. Most of its operations take place beneath the level of explicit awareness; we don't know how we reach conclusions and make decisions, we're merely presented with the results along with an emotional sense of rightness or confidence.
Despite these emotional assurances, we sometimes suspect that such feelings are unfounded. Careful examination shows that to be precisely the case. We can and do develop confidence in results, not because they are reliable, but for a host of other reasons.
Our approval or disapproval of some properties can cross over into our evaluation of others. We can fall prey to shortcuts while believing that we've been thorough. We tend to interpret evidence in terms of our preferences, perceiving what we want to perceive and screening out evidence we find inconvenient or uncomfortable. Sometimes, we even construct evidence out of whole cloth to support something we want to be true.
It's very difficult to detect these flaws in ourselves as we commit them. It is somewhat easier to detect them in others, or in hindsight while reflecting upon past decisions in which we are no longer strongly emotionally invested. Without knowing how our decisions are reached, though, we're helpless in the face of impulses and feelings of the moment, even while we remain skeptical about how our judgment functions.
So how can we try to improve our judgment if we don't even know what it's doing?
How did Aristotle establish the earliest-known examination of the principles of justification? If he originated the foundation of the systems we know as *logic*, how could that be accomplished without the use of logic?
As Aristotle noted, the principles he made into a set of formal rules already existed. He observed the arguments of others, noting how people defended positions and attacked the positions of others, and how certain arguments had flaws that could be pointed out while others seemed to possess no counters. His attempts to organize people's implicit understandings of the validity of arguments led to an explicit, formal system. The principles of logic were implicit before they were understood explicitly.
The brain is capable of performing astounding feats of computation, but our conscious grasp of mathematics is emulated and lacks the power of the system that creates it. We can intuitively comprehend how a projectile will move from just a glimpse of its trajectory, although solving the explicit differential equations that describe that motion is terrifically difficult, and virtually impossible to accomplish in real-time. Yet our explicit grasp of mathematics makes it possible for us to solve problems and comprehend ideas completely beyond the capacity of our hunter-gatherer ancestors, even though the processing power of our brains does not appear to have changed from those early days.
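The contrast can be made concrete. What intuition does in a glance, explicit reasoning must do as a slow, deliberate sequence of rule applications. Here is a minimal sketch (function names and parameters are my own illustration) that integrates a drag-free projectile's motion step by step, then checks the result against the closed-form range formula:

```python
import math

def simulate_projectile(v0, angle_deg, dt=0.001, g=9.81):
    """Explicitly integrate the equations of motion for a drag-free
    projectile, one small time step at a time -- the kind of slow,
    rule-following computation our intuition performs implicitly."""
    angle = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)
    while True:
        x += vx * dt
        y += vy * dt
        vy -= g * dt          # gravity acts on vertical velocity only
        if y <= 0.0:          # back at launch height: flight is over
            return x          # horizontal range

# The closed-form range formula, v0^2 * sin(2*theta) / g, should
# closely agree with the step-by-step simulation.
analytic = 20.0 ** 2 * math.sin(math.radians(60)) / 9.81
simulated = simulate_projectile(20.0, 30)
print(simulated, analytic)
```

The point is not that the arithmetic is deep, but that each step is explicit, checkable, and independent of any felt sense of where the projectile "ought" to land.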
In the same way, our models of what proper thought means give us options and opportunities far beyond what our intuitive, unconscious reasoning makes possible, even though the conscious understanding works with much fewer resources than the unconscious.
When we consciously and deliberately model the evolution of one statement into another according to elementary rules that make up the foundation of logical consistency, something new and exciting happens. The self-referential aspects of that modeling permit us to compare the decisions presented to us by the parts of our minds beneath the threshold of our awareness and override them. We can evaluate our own evaluations, reaching conclusions that our emotions don't lead us to and rejecting some of those that they do.
That's what rationality is: having explicit and conscious standards of validity, and applying them in a systematic way. It doesn't matter if we possess an inner conviction that something is true - if we can't demonstrate that it can be generated from basic principles according to well-defined rules, it's not valid.
What makes this so interesting is that it's self-correcting. If we observe an empirical relationship that our understanding doesn't predict, we can treat it as a new fact. For example, let's say that we find that certain manipulations of tarot decks permit us to predict the weather, even though we have no idea of why the two should be correlated at all. With rationality, we don't need to know why. Once we've recognized that the relationship exists, it becomes rational for us to use it. Likewise, if a previously-useful relationship suddenly ceases to be, even though we have no theoretical grounds for expecting that to happen, we simply acknowledge the fact. Once we've done so, we can justify ignoring that which we previously considered to be evidence.
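The tarot example can be made operational. A minimal sketch, using invented data (the names, the 90% hit rate, and the rain frequency are all my own assumptions for illustration): track how often the signal matches outcomes, compare against the best constant guess, and keep or discard the signal on that basis alone, with no theory of why it works.

```python
import random

def predictive_accuracy(predictions, outcomes):
    """Fraction of trials on which the signal matched the outcome."""
    hits = sum(p == o for p, o in zip(predictions, outcomes))
    return hits / len(outcomes)

random.seed(0)

# Hypothetical daily records: did it rain, and what did the
# tarot-based method predict? We need no causal story to test it.
outcomes = [random.random() < 0.3 for _ in range(1000)]
tarot = [o if random.random() < 0.9 else not o for o in outcomes]

# Baseline: always guessing the more common outcome.
baseline = max(sum(outcomes), len(outcomes) - sum(outcomes)) / len(outcomes)

# While the relationship holds, the signal beats the baseline,
# so it is rational to use it...
print(predictive_accuracy(tarot, outcomes), baseline)

# ...and if the relationship later breaks (here, destroyed by
# shuffling), the same test tells us to stop.
shuffled = random.sample(tarot, len(tarot))
print(predictive_accuracy(shuffled, outcomes), baseline)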
Human reasoning is especially plagued by superstitions, because it's easy for us to accept contradictory principles without acknowledging the inconsistency. But when we're forced to construct step-by-step justifications for our beliefs, contradiction is thrown into sharp relief, and can't be ignored.
Arguments that are not made explicitly, with conscious awareness of how each point is derived from fundamental principles and empirical observations, may or may not be correct. But they're never rational. Rational reasoning does not guarantee correctness; rational choice does not guarantee victory. What rationality offers is self-knowledge of validity. If rational standards are maintained when thinking, the best choice as defined by the knowledge we possess will be made. Whether it will be best when we gain new knowledge, or in some absolute sense, is unknown and unknowable until that moment comes.
Yet those who speak here often of the value of human rationality frequently don't do so by rational means. They make implicit arguments with hidden assumptions and do not acknowledge or clarify them. They emphasize the potential for rationality to bootstrap itself to greater and greater levels of understanding, yet don't concern themselves with demonstrating that their arguments arise from the most basic elements of reason. Rationality starts when we make a conscious attempt to understand and apply those basic elements, to emulate in our minds the principles that make the existence of our minds possible.
Are we doing so?
The Golem
Anthony Ravenscroft, writing on why it is important, in a relationship, to honestly communicate your grievances to the other person:
If you don't present your gripes to the responsible party, you cannot humanly bury those complaints - it's just not possible to "forget" about something that has hurt or stung you. Actually, you are probably "testing" these complaints against your experience of the person, trying to figure out what they would say, how they would react. You create a simulacrum in order to argue this all out in your head, and thus to avoid unpleasantness. Certain conclusions are made, which you file away. When another problem comes up, you then test this against your estimates of the person, which have been expanded by your previous guesswork.
Eventually, you will have created this huge guesswork of assumptions, which are so far removed from the actual person that they likely have no bearing on the reality. I call this "a golem made of boxes", a warehouse-sized beast that has nothing to do with the simple small human being from which it is supposedly modeled.
When I have had such a golem used against me, I was told by my lover that she had kept a rather ugly situation from me "because I know how you'd react." I described to her exactly what the situation was, as I'd pieced it together very accurately (you can do this with the actions of humans, not the humans themselves). She was stunned. When I described for her how the root assumptions she had made were very largely off the mark, she actually became very angry with me, defending the golem as though it represented the truth, and therefore I must be lying! In the end, she could have better determined my reaction from writing down the possibilities on slips of paper and choosing one out of a hat. ...
The golem is handy, but almost entirely dishonest. It begins from faulty (incomplete, biased) data, and runs rapidly downhill from there.
The map and the territory. How have you had the golem used against you? When have you, yourselves, made the mistake of resorting to a golem and had it blow up in your face?
Recommended Rationalist Resources
I thought Recommended Rationalist Reading was very useful and interesting. Now that we have voting and threading, it seems like a good time to comprehensively gather opinions on online material.
Please suggest high-quality links related to or useful for improving rationality. It could be a blog, a forum, a great essay, a reference site, an e-book, anything clickable. Anyone interested can then check out what looks promising and report back.
[edit]
There seems to be confusion... This post is for online material, not physical books. We already have Recommended Rationalist Reading, but as that hasn't got threading and voting, if people think it's a good idea I (or someone else) can do a separate post for books [metaedit] ...not happening, is against blog guidelines. [/metaedit]
Looks like we're getting lots of suggestions, so please don't forget to vote on them so busier readers have an idea which ones are most worth investigating!
[/edit]
Contributors - if making multiple suggestions, please give each their own comment so we can vote on them separately. Click 'Help' for how to do links.
Voters - for top level comments containing suggestions (as opposed to comments replying to suggestions), please vote on the quality of the resource, not anything else in the comment. If you feel strongly about the comment quality, just post a sub-comment.
Here are three to get started: