Optimizing the Twelve Virtues of Rationality
At the Less Wrong Meetup in Columbus, OH over the last couple of months, we discussed optimizing the Twelve Virtues of Rationality. In doing so, we were inspired by what Eliezer himself said in the essay:
Perhaps your conception of rationality is that it is rational to believe the words of the Great Teacher, and the Great Teacher says, “The sky is green,” and you look up at the sky and see blue. If you think: “It may look like the sky is blue, but rationality is to believe the words of the Great Teacher,” you lose a chance to discover your mistake.
So we first decided on the purpose of optimizing, and settled on yielding virtues that would be most impactful and effective for motivating people to become more rational - in other words, optimizations that would produce the most utilons and hedons for the purpose of winning. There were a bunch of different suggestions. I tried to apply them to myself over the last few weeks and want to share my findings.
First Suggestion
Replace Perfectionism with Improvement
Motivation for Replacement
Perfectionism, both in how it pattern-matches and in its actual description in the essay, orients toward focusing on defects and errors in oneself. By depicting the self as always flawed, and portraying the aspiring rationalist's job as hunting for those flaws, the virtue of Perfectionism is framed negatively and is bound to rely on negative reinforcement. Finding a flaw feels bad, and in many people that creates ugh fields around actually doing the search, as participants at the Meetup reported. A positive framing of this virtue would instead be Improvement. Then the aspiring rationalist can feel okay about where they are right now, while orienting toward improving and growing mentally stronger - Tsuyoku Naritai! All improvement would be about gaining more hedons, and thus harness the power of positive reinforcement. Generally, research suggests that positive reinforcement is effective in motivating the repetition of a behavior, whereas negative reinforcement works best to stop people from doing a behavior. No wonder Meetup participants reported that Perfectionism was not very effective in motivating them to grow more rational. So to gain both more hedons and more utilons - in the sense of the utility of seeking to grow more rational - Improvement might be a better term and virtue than Perfectionism.
Self-Report
I've been orienting myself toward improvement instead of perfectionism for the last few weeks, and it's been a really noticeable difference. I've become much more motivated to seek ways that I can improve my ability to find the truth. I've been more excited and enthused about finding flaws and errors in myself, because they are now an opportunity to improve and grow stronger, not become less weak and imperfect. It's the same outcome as the virtue of Perfectionism, but deploying the power of positive reinforcement.
Second Suggestion
Replace Argument with Community
Motivation for Replacement
Argument is an important virtue: a vital way of getting at the truth is to rely on others to debate us, highlight mistaken beliefs, and help us update on them, as the virtue describes. Yet orienting toward a rationalist Community has benefits beyond those of argument, which is only one part of such a community. A community provides an external perspective that research suggests is especially valuable for pointing out flaws and biases in one's ability to evaluate reality rationally, even without an argument. A community can also offer wise advice on making decisions, and it is especially beneficial to have a community of diverse and intelligent people of all sorts, in order to aggregate a wide variety of private information toward making the best decisions. Moreover, a community can provide systematic ways to improve: giving each other systematic feedback, compensating for each other's weaknesses in rationality, learning difficult things together, and otherwise supporting each other's pursuit of ever-greater rationality. Likewise, a community can collaborate, with different people fulfilling different functions in supporting everyone's growth - not everybody has to be the "hero," after all, and different people can specialize in various tasks that support others in growing mentally stronger, gaining comparative advantage as a result. Studies show that social relationships impact us powerfully in numerous ways, that they contribute to our mental and physical wellbeing, and that we become more like our social network over time (1, 2, 3). This further highlights the benefits of developing a rationalist-oriented community of diverse people around ourselves, to help us grow mentally stronger, get to the correct answer, and gain hedons and utilons alike for the purpose of winning.
Self-Report
After I updated my beliefs toward Community from Argument, I've been working more intentionally to create a systematic way for other aspiring rationalists in my LW meetup, and even non-rationalists, to point out my flaws and biases to me. I've noticed that by taking advantage of outside perspectives, I've been able to make quite a bit more headway on uncovering my own false beliefs and biases. I asked friends, both fellow aspiring rationalists and other wise friends not currently in the rationalist movement, to help me by pointing out when my biases might be at play, and they were happy to do so. For example, I tend to have an optimism bias, and I have told people around me to watch for me exhibiting this bias. They pointed out a number of times when this occurred, and I was able to improve gradually my ability to notice and deal with this bias.
Third Suggestion
Expand Empiricism to include Experimentation
Motivation for Expansion
This would not be a replacement of a virtue, but an expansion of the definition of Empiricism. As currently stated, Empiricism focuses on observation and prediction, and implicitly on making beliefs pay rent in anticipated experience. This is a very important virtue, and fundamental to rationality. It can be improved, however, by adding experimentation to the description of empiricism. By experimentation I mean expanding beyond simple observation, as described in the essay currently, to actually running experiments and testing things out in order to update our maps, both of ourselves and of the world around us. This would help us take initiative in gathering data about the world, rather than relying passively on observing it. My perspective on this topic was further strengthened by this recent discussion post, which caused me to further update my beliefs toward experimentation as a really valuable part of empiricism. Thus, including experimentation as part of empiricism would get us more utilons for getting at the correct answer and winning.
Self-Report
I have been running experiments on myself and the world around me long before this discussion took place. The discussion itself helped me connect the benefits of experimentation to the virtue of Empiricism, and also see the gap currently present in that virtue. I strengthened my commitment to experimentation, and have been running more concrete experiments, where I both predict the results in advance in order to make my beliefs pay rent, and then run an experiment to test whether my beliefs actually correlated to the outcome of the experiments. I have been humbled several times and got some great opportunities to update my beliefs by combining prediction of anticipated experience with active experimentation.
Conclusion
The Twelve Virtues of Rationality can be optimized to be more effective and impactful for getting at the correct answer and thus winning. There are many ways of doing so, but we need to be careful to choose the optimizations that would work best for the most people, based on the research on how our minds actually work. The suggestions I shared above are just some ways of doing so. What do you think of these suggestions? What are your ideas for optimizing the Twelve Virtues of Rationality?
Instrumental vs. Epistemic -- A Bardic Perspective
(This article expands upon my response to a question posed by pjeby here)
I've seen a few back-and-forths lately debating the instrumental use of epistemic irrationality -- to put the matter in very broad strokes, you'll have one commenter claiming that a particular trick for enhancing your effectiveness, your productivity, your attractiveness, demands that you embrace some belief unsupported by the evidence, while another claims that such a compromise is unacceptable, since a true art should use all available true information. As Eliezer put it:
I find it hard to believe that the optimally motivated individual, the strongest entrepreneur a human being can become, is still wrapped up in a blanket of comforting overconfidence. I think they've probably thrown that blanket out the window and organized their mind a little differently. I find it hard to believe that the happiest we can possibly live, even in the realms of human possibility, involves a tiny awareness lurking in the corner of your mind that it's all a lie.
And with this I agree -- the idea that a fully developed rational art of anything would involve pumping yourself with false data seems absurd.
Still, let us say that I am entering a club, in which I would like to pick up an attractive woman. Many people will tell me that I must believe myself to be the most attractive, interesting, desirable man in the room. An outside-view examination of my life thus far, and my success with women in particular, tells me that I most certainly am not. What shall I do?
Feeling Rational
A popular belief about "rationality" is that rationality opposes all emotion—that all our sadness and all our joy are automatically anti-logical by virtue of being feelings. Yet strangely enough, I can't find any theorem of probability theory which proves that I should appear ice-cold and expressionless.
So is rationality orthogonal to feeling? No; our emotions arise from our models of reality. If I believe that my dead brother has been discovered alive, I will be happy; if I wake up and realize it was a dream, I will be sad. P. C. Hodgell said: "That which can be destroyed by the truth should be." My dreaming self's happiness was opposed by truth. My sadness on waking is rational; there is no truth which destroys it.
The peril of ignoring emotions
Related to: Luminosity Sequence, Unknown Knowns
Let me introduce you to a hypothetical high school student, Sally. She’s smart and pretty and outgoing, and so are her friends. She considers herself a modern woman, sexually liberated, and this is in line with the lifestyle her friends practice. They think sex is normal and healthy and fun. Sally isn’t just pretending in order to fit in; these really are her friends, this really is her milieu, and according to health class, sex between consenting adults is nothing to be ashamed of. Sally isn't a rigorous rationalist, although she likes to think of herself as rational, and she's no more self-aware than the average high school girl.
Now Sally meets a boy, Bob, and she thinks he’s cute, and he thinks she’s cute too. Bob is part of her crowd. Her friends like him; he respects women, treats Sally well, and, like any healthy teenage boy, is fairly horny. According to her belief system, that shouldn’t set off any alarm bells. She’s been warned about abusive relationships, but Bob is a nice guy. So when they go upstairs together at her friend’s party, she has every reason to be excited and a little nervous, but not uncomfortable. The idea that Mom wouldn’t approve is so obviously irrelevant that she ignores it completely.
...And afterwards, she feels guilty and violated and horrible about herself, even though it was her decision.
I used this example because I expect it’s not unusual. On the surface, Sally’s discomfort seems to come out of nowhere, but modern North American society is chock-full of contradictory beliefs about sex. Sex is normal and healthy. Sex is dirty. Sex is only for when you’re married. If Sally’s mother is Christian, or even just conservative, Sally would have internalized those beliefs when she was a child. It would have been hard not to. They’re her unknown knowns, and she may not have noticed them before, because there’s a wide psychological gap between believing it’s okay for others to behave a particular way, and believing it’s okay for you. The meme ‘don’t pass judgement on other people’ is, I think, pretty widespread in North America and maybe more so in Canada, but so is holding oneself to a high standard...and those are contradictory.
I think that the nagging, seemingly irrational moment of ‘that doesn’t feel right’ is important. It potentially reveals something about the beliefs and attitudes you hold that you don’t even know about. Sally’s response to her nagging doubt could have been the following:
Hmm, that’s interesting, why does it bother me so much that Mom would disapprove? I guess when we used to go to church, they said sex was only for when you’re married. But I don’t believe anything else they said in church. ...Well, I guess I want Mom to be proud of me. I want her to praise me for doing well in school. And I think lying is wrong, so the fact that I either have to lie to her about having had sex, or face her disapproval, maybe that’s why I’m uncomfortable? But I don’t want to say no, it’ll make me look like a prude... Still, what if everyone feels this way at the start? I know Alice went to church too when she was a kid, and her mom would kill her if she knew she was sexually active, I wonder if that bothers Alice? Hmm, I think maybe it’s still the right choice to sleep with Bob, but maybe I’m taking this too lightly? Maybe this should be a big deal and I should feel anxious? After all, he might judge me anyway, he might think I’m too easy, or a slut. Maybe I can just explain to him that I want to think about this longer... After all, why should I assume something is right just because they told us in health class? That’s just like in church, it’s taking someone else’s opinion on faith. I’ve never actually thought about this, I’ve just followed other people. Who’s to say they’re right?
Whatever decision Sally makes, she probably won’t feel violated. She listened to her feelings and took them into consideration, even though they seemed irrational. As it turned out, they were a reasonable consequence of a belief-fragment that she hadn’t even known she had. So as a consequence of stopping to think, she knows herself better too. She’ll be better able to predict her behaviour in future situations. She’ll be less likely to ignore her threshold-warning discomfort and make risky choices as a result of peer pressure alone. She’ll be more likely to think.
To conclude: emotions exist. They are real. If you ignore them and plow on ahead, you won’t necessarily thank yourself afterwards. And that nagging feeling is a priceless moment to find out about your unknown knowns...which may not be rational, which may have been laid down in some previous era and never questioned since, but which part of you is going to try to uphold until you consciously deconstruct them.
How subjective is attractiveness?
Consider the two statements:
- There is a universal standard for beauty.
- Beauty is in the eye of the beholder.
Most people would agree that there's some truth to each of these statements. At Thing of Things Ozy wrote:
As for the beauty thing… well, yeah, everyone’s beautiful in the sense that everyone is sexually attractive to someone, and that human bodies in general are pretty cool-looking. But conventional attractiveness is still a thing. While I’m fairly conventionally attractive (thin, white, clear skin, symmetrical features), I doubt hairy legs, bound chests, and haircuts that make one look like a teenage boy are going to be all the rage at Cosmo any time soon.
This post explores the question of the extent to which each of the two statements is true, using data from a study of speed dating events conducted by Raymond Fisman and Sheena Iyengar.
The basic facts that I describe here are:
- Attractiveness as defined by group consensus can be modeled well using a normal distribution.
- The group consensus on somebody's attractiveness accounted for roughly 60% of the variance in people's perceptions of the person's relative attractiveness.
- The distribution of people's perceptions of the relative attractiveness of a fixed person can be modeled well using a normal distribution. Moreover, the standard deviations of these distributions tend to be quite close to one another (across different people), so that it's often possible to approximate the entire distribution of perceptions of somebody's relative attractiveness using only the mean of the distribution, which is just the group consensus on the person's attractiveness.
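The facts above describe a simple two-component model: each person has a "consensus" attractiveness drawn from a normal distribution, and each rater's perception is that consensus plus independent normal noise, scaled so the consensus accounts for roughly 60% of the variance. A minimal simulation sketch (the sample sizes and scale are illustrative assumptions, not figures from the study):

```python
import random
import statistics

random.seed(0)

# Two-component model: perception = group consensus + individual noise.
# The noise scale is chosen so that consensus explains ~60% of variance:
# f = c^2 / (c^2 + n^2) = 0.6  =>  n^2 = c^2 * 0.4 / 0.6
n_people, n_raters = 1000, 50   # illustrative sample sizes
consensus_sd = 1.0
noise_sd = (consensus_sd**2 * 0.4 / 0.6) ** 0.5

perceptions = []
consensus_component = []
for _ in range(n_people):
    consensus = random.gauss(0, consensus_sd)
    for _ in range(n_raters):
        perceptions.append(consensus + random.gauss(0, noise_sd))
        consensus_component.append(consensus)

total_var = statistics.pvariance(perceptions)
consensus_var = statistics.pvariance(consensus_component)
print(f"share of variance from consensus: {consensus_var / total_var:.2f}")
```

Because the per-person noise standard deviations are roughly equal, knowing only the mean (the consensus) pins down the whole distribution of perceptions, which is what the third bullet above asserts.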
There's much more to say about how to interpret the group consensus and its implications, which I'll go into in a later post.
How my social skills went from horrible to mediocre
Over the past few months, I've become aware that my understanding of social reality had been distorted to an extreme degree. It took 29 years for me to figure out what was going on, but I finally now understand.
The situation is very simple: the amount of time that I put into interacting in typical social contexts was very small, so I didn't get enough feedback to realize, as I otherwise would have, that I had a major blindspot.
Now that I've identified the blindspot, I can work on it, and my social awareness has been increasing at a very rapid clip. I had no idea that I had so much potential for social awareness. I had been in a fixed mindset rather than a growth mindset; I had thought, "social skills will never be my strong point, so I shouldn't spend time trying to improve them; instead I should focus on what I'm best at." I'm astonished by how much my relationships have improved over a span of mere weeks.
I give details below.
Cached Thoughts
One of the single greatest puzzles about the human brain is how the damn thing works at all when most neurons fire 10-20 times per second, or 200Hz tops. In neurology, the "hundred-step rule" is that any postulated operation has to complete in at most 100 sequential steps—you can be as parallel as you like, but you can't postulate more than 100 (preferably less) neural spikes one after the other.
Can you imagine having to program using 100Hz CPUs, no matter how many of them you had? You'd also need a hundred billion processors just to get anything done in realtime.
If you did need to write realtime programs for a hundred billion 100Hz processors, one trick you'd use as heavily as possible is caching. That's when you store the results of previous operations and look them up next time, instead of recomputing them from scratch. And it's a very neural idiom—recognition, association, completing the pattern.
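The caching idiom described above is familiar from everyday programming. A minimal sketch in Python (the naive Fibonacci function here is a standard illustration, not anything from the post):

```python
import functools

# Caching: store the results of previous operations and look them up
# next time, instead of recomputing them from scratch.
@functools.lru_cache(maxsize=None)
def fib(n):
    """Exponential-time without the cache; linear-time with it."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

# With caching, each fib(k) is computed once and then looked up,
# so even fib(100) returns instantly.
print(fib(100))
```

The lookup-instead-of-recompute trade is exactly the one the paragraph above describes: cheap, massively parallel storage standing in for slow serial computation.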
It's a good guess that the actual majority of human cognition consists of cache lookups.
This thought does tend to go through my mind at certain times.
Extreme Rationality: It's Not That Great
Related to: Individual Rationality is a Matter of Life and Death, The Benefits of Rationality, Rationality is Systematized Winning
But I finally snapped after reading: Mandatory Secret Identities
Okay, the title was for shock value. Rationality is pretty great. Just not quite as great as everyone here seems to think it is.
For this post, I will be using "extreme rationality" or "x-rationality" in the sense of "techniques and theories from Overcoming Bias, Less Wrong, or similar deliberate formal rationality study programs, above and beyond the standard level of rationality possessed by an intelligent science-literate person without formal rationalist training." It seems pretty uncontroversial that there are massive benefits from going from a completely irrational moron to the average intelligent person's level. I'm coining this new term so there's no temptation to confuse x-rationality with normal, lower-level rationality.
And for this post, I use "benefits" or "practical benefits" to mean anything not relating to philosophy, truth, winning debates, or a sense of personal satisfaction from understanding things better. Money, status, popularity, and scientific discovery all count.
So, what are these "benefits" of "x-rationality"?
A while back, Vladimir Nesov asked exactly that, and made a thread for people to list all of the positive effects x-rationality had on their lives. Only a handful responded, and most responses weren't very practical. Anna Salamon, one of the few people to give a really impressive list of benefits, wrote:
I'm surprised there are so few apparent gains listed. Are most people who benefited just being silent? We should expect a certain number of headache-cures, etc., just by placebo effects or coincidences of timing.
There have since been a few more people claiming practical benefits from x-rationality, but we should generally expect more people to claim benefits than to actually experience them. Anna mentions the placebo effect, and to that I would add cognitive dissonance - people spent all this time learning x-rationality, so it MUST have helped them! - and the same sort of confirmation bias that makes Christians swear that their prayers really work.
I find my personal experience in accord with the evidence from Vladimir's thread. I've gotten countless clarity-of-mind benefits from Overcoming Bias' x-rationality, but practical benefits? Aside from some peripheral disciplines[1], I can't think of any.
Looking over history, I do not find any tendency for successful people to have made a formal study of x-rationality. This isn't entirely fair, because the discipline has expanded vastly over the past fifty years, but the basics - syllogisms, fallacies, and the like - have been around much longer. The few groups who made a concerted effort to study x-rationality didn't shoot off an unusual number of geniuses - the Korzybskians are a good example. In fact as far as I know the only follower of Korzybski to turn his ideas into a vast personal empire of fame and fortune was (ironically!) L. Ron Hubbard, who took the basic concept of techniques to purge confusions from the mind, replaced the substance with a bunch of attractive flim-flam, and founded Scientology. And like Hubbard's superstar followers, many of this century's most successful people have been notably irrational.
There seems to me to be approximately zero empirical evidence that x-rationality has a large effect on your practical success, and some anecdotal empirical evidence against it. The evidence in favor of the proposition right now seems to be its sheer obviousness. Rationality is the study of knowing the truth and making good decisions. How the heck could knowing more than everyone else and making better decisions than them not make you more successful?!?
This is a difficult question, but I think it has an answer. A complex, multifactorial answer, but an answer.
Atheism = Untheism + Antitheism
One occasionally sees such remarks as, "What good does it do to go around being angry about the nonexistence of God?" (on the one hand) or "Babies are natural atheists" (on the other). It seems to me that such remarks, and the rather silly discussions that get started around them, show that the concept "Atheism" is really made up of two distinct components, which one might call "untheism" and "antitheism".
A pure "untheist" would be someone who grew up in a society where the concept of God had simply never been invented - where writing was invented before agriculture, say, and the first plants and animals were domesticated by early scientists. In this world, superstition never got past the hunter-gatherer stage - a world seemingly haunted by mostly amoral spirits - before coming into conflict with Science and getting slapped down.
Hunter-gatherer superstition isn't much like what we think of as "religion". Early Westerners often derided it as not really being religion at all, and they were right, in my opinion. In the hunter-gatherer stage the supernatural agents aren't particularly moral, or charged with enforcing any rules; they may be placated with ceremonies, but not worshipped. But above all - they haven't yet split their epistemology. Hunter-gatherer cultures don't have special rules for reasoning about "supernatural" entities, or indeed an explicit distinction between supernatural entities and natural ones; the thunder spirits are just out there in the world, as evidenced by lightning, and the rain dance is supposed to manipulate them - it may not be perfect but it's the best rain dance developed so far, there was that famous time when it worked...
If you could show hunter-gatherers a raindance that called on a different spirit and worked with perfect reliability, or, equivalently, a desalination plant, they'd probably chuck the old spirit right out the window. Because there are no special rules for reasoning about it - nothing that denies the validity of the Elijah Test that the previous rain-dance just failed. Faith is a post-agricultural concept. Before you have chiefdoms where the priests are a branch of government, the gods aren't good, they don't enforce the chiefdom's rules, and there's no penalty for questioning them.
And so the Untheist culture, when it invents science, simply concludes in a very ordinary way that rain turns out to be caused by condensation in clouds rather than rain spirits; and at once they say "Oops" and chuck the old superstitions out the window; because they only got as far as superstitions, and not as far as anti-epistemology.
The Untheists don't know they're "atheists" because no one has ever told them what they're supposed to not believe in - nobody has invented a "high god" to be chief of the pantheon, let alone monolatry or monotheism.
Thinking well
Many people want to know how to live well. Part of living well is thinking well, because if one thinks the wrong thoughts it is hard to do the right things to get the best ends.
We think a lot about how to think well, and one of the first things we thought about was how to not think well. Bad ways of thinking repeat in ways we can see coming, because we have looked at how people think and know more now about that than we used to.
But even if we know how other people think bad thoughts, that is not enough. We need to both accept that we can have bad ways of thinking and figure out how to have good ways of thinking instead.
The first is very hard on the heart, but is why we call this place "Less Wrong." If we had called it something like more right, it could have been about how we're more right than other people instead of more right than our past selves.
The second is very hard on the head. It is not just enough to study the bad ways of thinking and turn them around. There are many ways to be wrong, but only a few ways to be right. If you turn left all the way around, it will point right, but we want it to point up.
The heart of our approach has a few parts:
- We are okay with not knowing. Only once we know we don't know can we look.
- We are okay with having been wrong. If we have wrong thoughts, the only way to have right thoughts is to let the wrong ones go.
- We are quick to change our minds. We look at what is when we get the chance.
- We are okay with the truth. Instead of trying to force it to be what we thought it was, we let it be what it is.
- We talk with each other about the truth of everything. If one of us is wrong, we want the others to help them become less wrong.
- We look at the world. We look at both the time before now and the time after now, because many ideas are only true if they agree with the time after now, and we can make changes to check those ideas.
- We like when ideas are as simple as possible.
- We make plans around being wrong. We look into the dark and ask what the world would look like if we were wrong, instead of just what the world would look like if we were right.
- We understand that as we become less wrong, we see more things wrong. We try to fix all the wrong things, because as soon as we accept that something will always be wrong we can not move past that thing.
- We try to be as close to the truth as possible.
- We study as many things as we can. There is only one world, and to look at a part tells you a little about all the other parts.
- We have a reason to do what we do. We do these things only because they help us, not because they are their own reason.