I say this just to offer evidence that something about "rationality" works.
Rationality working is one possible explanation of this, but it's not the only one or even the most likely.
There are all sorts of interesting sociological differences between actively religious people and the nonreligious, usually to the advantage of theists. They live longer, report greater happiness, are healthier by most measures of health, and I think have some protection against mental disease. Most studies investigating these advantages find they have nothing to do with the content of the religion and everything to do with the religion providing easy access to the religious community, a friendly and supportive social group to which other believers have an automatic "in".
I have a feeling this works in more subtle ways than just the obvious; it's not just about going to church and seeing people, but about slowly absorbing these people's norms (which are usually pretty positive in practice even when the theory behind them is repulsive) and internalizing their conception of you as a genuinely okay person because you're part of the in-group.
A lot of what you're talking about sounds p...
This is not to devalue the importance of the material - most of us would not fit into a religious community no matter how hard we tried
FWIW, I just spent the last two years in a church in hopes of achieving such benefits. A few weeks ago I classified the experiment as a failure -- I was more connected to others and generally happier in the one month I spent with the NYC rationalist community than at any time with the religious group, with which I had spent more time.
Just visited the UU church in Waco and went to their three hour intro. Looks to be compatible with me, something I don't have to put a mask on for.
My grandparents were Quakers. I've been to a few of their meetings. A Quaker meeting consists of everyone in the congregation sitting silently in a room, with individuals standing up to speak at irregular and unplanned intervals. In my experience, when people stand up to speak, they talk about the things that are important in their "spiritual" lives, which, in practice, means their emotional/moral lives. God was mentioned only in passing, and, aside from these mentions of God, I don't remember anything mystical.
It's possible, but I worry that our friendly local countersignalers are underestimating the power of being sane.
Most people stumble in with their friends. Your friends are the people you happen to sit next to on the first day of class, people who work in the same office as you, people who belong to the same clubs as you, people who go to the same bars as you. This is usually local, because as the search radius increases, the amount of new data you have to deal with (people to filter out) becomes excessive.
It takes a strong sense of purpose to travel an hour and a half by train to meet up with strangers at an apartment in order to find a community, all based on the fact that you read the same blog. That is a very small part of the search space.
There are many things that are claimed to give people large amounts of happiness. Most don't work, and many that do work won't work for a given person. Quickly identifying what works for you and making a beeline toward it is one of the largest benefits rationality can give a typical person. People see this and focus on the "it" (in this case, finding a community) and say "of course that made you happy." This feels like hindsight bias. If you had met SarahC a year ago, would you have said to her "Oh, you obviously need to meet us with...
Something that occurred to me, inspired by many of the details of your story, was that actively seeking to cultivate rationality may internalize one's locus of control.
Locus of control is a measurable psychological trait that ranges from "internal" to "external", where an internal locus roughly indicates that you think events in your life are primarily affected by your self, your plans, your choices, and your skills. You can measure it generally or for specific domains; an internal locus of control is associated with more interest and participation in politics, and with better management of diabetes.
My initial hypothesis for any particular person (reversing the fundamental attribution error and out of considerations of inferential distance) is generally that their personal locus of control is a basically accurate assessment of their abilities within the larger context of their life. If someone lives in a violent and corrupt country and lacks money, guns, or muscles then an external locus of control is probably a cognitive aspect of their honest and effective strategy for surviving by "keeping their head down". When I imagine trying to change someone's ...
Your story makes me wonder about connections in the other direction, from rationality to locus of control. It seems plausible that cultivated rationality might teach people to notice patterns, to find points of leverage, and to see the ways that they can affect the things that matter to them.
I would definitely say yes. There are people who have a tendency to think that if there's any major component of randomness involved in something, then it's pointless to try to make plans relating to that thing. Simply grokking expected utility and some very basic probability theory would help these people tremendously, while also shifting their locus of control inwards.
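A toy illustration of the point above (my own sketch, not from the thread): even when each individual outcome is dominated by chance, comparing expected values still tells you which plan to pursue, and the difference shows up reliably over repeated trials.

```python
import random

def expected_value(prob, payoff):
    """Expected value of a simple win-or-nothing gamble."""
    return prob * payoff

# Two hypothetical plans whose payoffs are mostly random:
# Plan A: 10% chance of 100, otherwise 0 -> EV = 10
# Plan B: 50% chance of 12,  otherwise 0 -> EV = 6
ev_a = expected_value(0.10, 100)
ev_b = expected_value(0.50, 12)

# Any single trial of Plan A usually pays nothing, yet over many
# trials the higher-EV plan comes out ahead on average.
random.seed(0)
trials = 100_000
total_a = sum(100 for _ in range(trials) if random.random() < 0.10)
total_b = sum(12 for _ in range(trials) if random.random() < 0.50)

print(ev_a, ev_b)  # 10.0 6.0
print(total_a / trials, total_b / trials)
```

The numbers here are made up for illustration; the point is only that "it's random, so planning is pointless" confuses the variance of a single outcome with the value of the policy.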
But now we're getting into "blaming the victim" territory with all the confusions inherent to politics. It makes me wonder if a strong desire to be sympathetic, translated into controversial political questions like these, limits a person's likely appreciation for cultivated rationality? Maybe the (Gendlin-ignoring) logic would run: "If I believed people could have predicted and avoided their current tragic circumstances, then it would be harder for me to be sympathetic, but I want to be sympathetic, so I should not believe that people could have predicted and avoided their tragedy."
I think it is better to be sympathetic regardless of whether the "people could have predicted and avoided their current tragic circumstances" (whatever the counterfactual means, maybe that a more rational person facing the same problem would have predicted and avoided the problem?).
Like Eliezer says:
It is not always helping people, to save them from the consequences of their own actions; but I draw a moral line at capital punishment. If you're dead, you can't learn from your mistakes.
I am going to go ahead and push that moral line out to cover paralyzing loss of autonomy.
I'm more comfortable viewing myself as a junior member of the Interesting-People Club.
The whole thing was good, but I particularly liked this bit. Humble-but-awesome is tough to pull off!
I made a unilateral decision to be happier, and though I hate to jinx it, I think it's working.
How is the superstition working out for you? ;)
Some of it has begun to show results -- the time-management habits I came up with have started to improve my academic performance
More on those PLEASE! (Or link if you already wrote about it.)
To my memory, I've been following along with this idea since before this blog, or Overcoming Bias, existed. I can't really say that it's done me all that much good.
Note also that this could be transformed from "good" to "timeless classic" by going through, taking all paragraphs about something abstract, and inserting at least one concrete example into each of them.
I for one would like to applaud the 20 members of the LessWrong community who just applauded Eliezer for applauding SarahC for applauding the LessWrong community.
Nice article.
One thing that I've found interesting is that rationality doesn't seem to make people happier by cleaning up their beliefs, so much as it does by inspiring more self-confidence.
Some quick ideas:
I find that rationality is much more attractive when you emphasize its ability to help you do things that you care about, rather than its ability to shoot down other ideas.
Rationality has greatly increased the rate at which my life changes, and made me a lot more comfortable in areas I wasn't before.
*Not sure if this is true.
This is inspiring. It is taking me a little longer (I discovered rationality two years ago); but I feel like I am on the brink of the next level of awesome. Thank you.
FWIW, I am inclined to think that "rationality" is a bad brand identification for a good thing. Rationality conjures up "Spock" (the Star Trek character), not "Spock" (the compassionate and wise child-rearing guru). It puts an emphasis on a very inhuman part of the kind of human being you feel you are becoming.
Whatever it means in your context, as a brand to evangelize to others about its benefits, it is lacking. Better, in the sense of offering a positive vision, perhaps than "atheism" or "secularism" bu...
So far as I know I've been the one doing most of the asking, and I don't have a large enough sample size to declare anything, just seven people. The results have been mostly neutral, with one enthusiastically positive and two slightly negative. If I were to extrapolate from this, I'd say that enough people are at least neutral to the word that it won't harm us to use it.
If our goal were to find an optimum marketing word, I'd wait until we'd done much more substantial testing. But I think there's benefit to changing the Spock Perception, so as long as people are mostly neutral toward the word, it's worth using. (I'd still want more than seven responses before committing to it.)
The specific question I've been asking people is:
"I'm just curious, if someone were to describe themselves as a Rationalist to you, what stereotypes would come to your mind about that person?"
(The first time, I started by saying "What thoughts and feelings come to your mind if I say 'Rationality'?" That prompted some questions and confusion that I don't have time for in the typical elevator ride, which is where I do the asking. By the third query I had narrowed it down to the phrasing abov...
I have heard of it.
I think it's an awful name, exactly on the grounds of having huge negative baggage. For me, at least, it has strong associations of smug, superior, condescending, and other such qualities.
I think this is a good occasion to point out a terrible bug:
Edited for concreteness.
Exactly one year ago, LessWrong helped me change my mind about something important.
Since then, my life has been changing very rapidly, as a direct result of the rationalist community. I got in touch with other rationalists in person, which made my social life vastly more interesting (not to say surreal). My plans for the future have definitely shifted a bit. I began a deliberate habit of trying new things and learning new skills, and facing up to my flaws, often with advice from LessWrongers or IRL rationalist friends.
A few examples: I improved my diet (paleo), tried yoga, took up cognitive behavioral therapy to work on some chronic insecurities, moved Python from the "wish I knew" box to the "have a detailed plan to learn" box, dared to publish some popular-science articles under my real name, learned to do Fermi calculations in my head. I also noticed that my habits of thought have been changing: for one thing, I'm getting better calibrated about probabilities -- I'm better at estimating how I did on schoolwork. For another thing, I'm getting better at not reflexively dismissing non-standard ideas: the first time someone mentioned to me that a good statistician could make a lot of money in car insurance by finding new correlations to monetize, I thought "Car insurance? Hmph, low status." The second time I heard that suggestion, about five months later, I thought "Hey, that's a decent idea." Some of these changes have begun to show results -- the time-management habits* I came up with have started to improve my academic performance, and I notice I'm far less inhibited about taking the initiative to work on projects (I have a couple of interesting balls in the air now, including a business idea and some volunteer work for SIAI, whereas I used to be very reluctant to volunteer for things). I've become much more open to cold-emailing people who work on interesting things (on one occasion I got a job offer out of an AI researcher); I'm more comfortable viewing myself as a junior member of the Interesting-People Club. I made a unilateral decision to be happier, and though I hate to jinx it, I think it's working.
I say this just to offer evidence that something about "rationality" works. I'm not sure what it is; many of the components of LessWrong-style rationality exist elsewhere (cognitive biases are fairly common knowledge; self-improvement hacks aren't unique to LessWrong; Bayesian statistics wasn't news to me when I got here). If anything, it's the sense that rationality can be an art, a superpower, a movement. It's the very fact of consolidating and giving a name and culture to the ideas surrounding how humans can think clearly. I'm never sure how much of that is a subjective primate in-group thing, but I'm hesitant to be too suspicious -- I don't want to blow out the spark before the fire has even started. My point is, there's something here that's worthwhile. It's not just social hour for nerds (not that we can't enjoy that aspect) -- it actually is possible to reach out to people and make a difference in how they live and see the world.
Once upon a time -- it seems like ages ago -- I used to envy a certain kind of person. The kind who has confidence that he can make a decent stab at ethical behavior without the threat of divine wrath. The kind who thinks that human beings have something to be proud of, that we're getting better at understanding the world and fitfully reducing suffering and injustice. The kind who thinks that he, personally, has some chance to make a valuable contribution. The kind who's audacious, who won't let anybody tell him what to think. The kind who whistles as he wins. Bertrand Russell seemed to be like that; also Robert Heinlein, and a couple of close friends of mine. That attitude, to me, seemed like a world of cloudless blue sky -- what a pity that I couldn't go there!
Ah, folly. Thing is, none of that attitude, strictly speaking, is rationality -- it might be what comes before rationality. It might be what makes rationality seem worthwhile. It might simply be the way you think if you read a lot of science fiction in your youth. But I've never seen it encouraged so well as here. When people ask me "What's a rationalist anyway," I tell them it's living the empirical life: trying to look at everything as though it's science, not just the lab -- trying different things and seeing what works, trying to actually learn from everything you observe.
I'm grateful for all this. While it's probably for the best that we don't pat ourselves on the back too much, I'm convinced that we should notice and appreciate what works. I used to be uncomfortable with evangelism, but now I tend to refer people to LessWrong when they mention a related idea (like complaining about incoherent arguments in debates). I think more visibility for us would be a good thing. I have plans to make a "rationality toy" of sorts -- I know other people have projects in that vein -- the more things we can create beyond the blog, the more alternate channels people have to learn about these ideas. And the more we can inspire the less confident among us that yes, you can do something, you can contribute.
*My anti-procrastination tactics are goal tracking via Joe's Goals and selective internet blocking via Self Control. Also posting my weekly goals to the New York Less Wrong mailing list. My problem up until now has really been spending too few hours on work -- in the bad old days I would frequently spend only 5 hours working on a weekday or 3 hours on a Saturday and the rest fooling around on the internet. I was really hooked on the intermittent stimulation of certain message boards, which I'm mostly glad to have given up. Now I'm aiming for 60-hour weeks. One thing that works in my favor is that I've almost completely stopped motivating myself by the ideal of being a "good girl" who receives approval; the reason I'm trying to get more work done is so that I can get credentials and preparation for the life I actually want to lead. I'm trying to be strategic, not ascetic. I don't know if what I've done is enough -- there's always someone who works harder or longer and seems to never need a break. But it's definitely better than nothing.