Something I haven't seen discussed much is what kind of emotional tools would be good for beginner rationalists. I'm especially interested in this topic because, as part of my broader project of spreading rationality to a wide audience and thus raising the sanity waterline, I come across a lot of people who are interested in becoming more rational but have difficulty facing the challenges of the Valley of Bad Rationality. In other words, they have trouble acknowledging their own biases and faults, facing the illusions within their moral systems and values, letting go of cached patterns, updating their beliefs, etc. Many thus abandon their aspiration toward rationality before they get very far. I think this is a systematic failure mode for many beginner aspiring rationalists, so I wanted to start a discussion about what we as a community can do about it.
Note that this emotional danger does not feel intuitive to me, and likely not to many of you either. In a Facebook discussion with Viliam Bur, he pointed out that he did not experience the Valley. I personally did not experience it much either. However, based on the evidence from the Intentional Insights outreach efforts, this is a case of the typical mind fallacy: the Valley is a real obstacle for many, though far from all, aspiring rationalists. So we should make an effort to address it if we want to raise the sanity waterline effectively.
I'll start by sharing what I found effective in my own outreach efforts. First, I found it helpful to frame the aspiration toward rationality not as a search for a perfect and unreachable ideal, but as constant improvement from the baseline where all humans start toward something better. I highlight the benefits people get from this improved mode of thinking, to prime them to focus on their current selves and detach from their past selves. I emphasize the value of self-empathy and self-forgiveness for holding mistaken views, and encourage people to think of themselves as becoming more right, rather than less wrong :-)
Another thing I found helpful was to provide new aspiring rationalists with a sense of community and social belonging. Joining a community of aspiring rationalists who are sensitive to a newcomer's emotions, and who help that newcomer deal with the challenges s/he experiences, is invaluable for overcoming the emotional strains of the Valley. Especially useful is having people trained in coaching/counseling serve as mentors for new members, guiding their intellectual and emotional development alike. I'd suggest that every LW meetup group consider instituting a system of mentors who can provide both emotional and intellectual support for new members.
Now I'd like to hear about your experiences traveling the Valley, and what tools you and others you know used to manage it. Also, what are your ideas about useful tools for that purpose in general? I look forward to hearing your thoughts!
I'm aware of the possibility, and I also mentioned it in the Facebook debate. Or, more likely, I have trouble finding the right words to express what I want to say:
I have had situations where I didn't know something, when I forgot things, when I believed information that was wrong, etc. Lots of them. Still doing it. Most likely always will.
In the past (before finding LW) I repeatedly experimented with belief in belief (because I wanted the placebo effects or social approval), but those experiments were always half-assed and very short-lived; they felt incompatible with my personality. I couldn't stop being aware that I was merely acting.
I also fail a lot at instrumental rationality. I am aware of what I should do... and I somehow just don't do it.
But I don't remember a situation where I enjoyed being wrong or didn't care about being wrong, as described here and here. That just feels completely strange to me. I have trouble empathising with people who, upon learning that they were wrong, just don't give a fuck.
Therefore -- that's why I mentioned it in the debate -- I have no clue what to tell them to help them change their ways. I have never been there (as far as I know), and I have no idea what it feels like to be there. So I have no model that would help me test which ideas might be attractive enough to draw a person out of it.
EDIT: I feel like I should add so many disclaimers here. I am happy that at least Gleb understands what I was trying to say.
Of course there are situations where you want to keep a map despite knowing it is not correct, e.g. when it is a useful simplification, like Newtonian physics. I am talking about people whose maps are not even approximately correct, but who still keep them because... I am only guessing here... they still provide emotional comfort.
I don't feel comfortable having an obviously wrong map, even if it would be socially approved. I have trouble belonging to most groups, because sooner or later there is a shared group map you have to accept. For example, having a political opinion (in the sense of completely buying a standardized map) feels like insanity, on the same level as belonging to a cult. (I am strongly sympathetic to the libertarian ethic of not initiating force. That doesn't convince me that the best way to organize a society is to dismantle all states and let the warlords fight it out in the "free market".)
There may also be unlucky situations where I am wrong, other people are right, but they lack the right words to convince me (sometimes because they themselves believe the right thing for the wrong reasons, e.g. because it is a standard belief in their social group). But I don't have an epistemic strategy for avoiding such situations without making things worse on average; of course, believing everything wouldn't be an improvement.
Etc.
Viliam, I indeed do understand what you're saying. Having a belief that I know is wrong is anathema to me.
But I think you and I, and probably many Less Wrongers, are on the far end of the spectrum of placing a strong emotional value on having true beliefs, and there are many people who give much less of a fuck about that than we do. Moreover, they place a strong emotional value on staying attached to their existing beliefs.
That's why the project of spreading rationality is hard - only a small subset of the population has that strong intuitive value. This is why...