Something I haven't seen discussed much is what kind of emotional tools would be good for beginner rationalists. I'm especially interested in this topic because, as part of my broader project of spreading rationality to a wide audience and thus raising the sanity waterline, I come across a lot of people who are interested in becoming more rational but have difficulty facing the challenges of the Valley of Bad Rationality. In other words, they have trouble acknowledging their own biases and faults, facing the illusions within their moral systems and values, letting go of cached patterns, updating their beliefs, etc. Many thus abandon their aspiration toward rationality before they get very far. I think this is a systematic failure mode of many beginning aspiring rationalists, so I wanted to start a discussion about what we can do about it as a community.
Note that this emotional danger does not feel intuitive to me, and likely not to many of you either. In a Facebook discussion with Viliam Bur, he pointed out that he did not experience the Valley. I personally did not experience it much either. However, based on the evidence from the Intentional Insights outreach efforts, assuming our experience generalizes is a typical mind fallacy: the Valley is a real obstacle for many, though far from all, aspiring rationalists. So we should make an effort to address it if we want to raise the sanity waterline effectively.
I'll start by sharing what I found effective in my own outreach efforts. First, I found it helpful to frame the aspiration toward rationality not as a search for a perfect and unreachable ideal, but as constant improvement from the baseline where all humans start toward something better. I highlight the benefits people get from this improved mode of thinking, to prime them to focus on their current selves and detach from their past selves. I emphasize the value of self-empathy and self-forgiveness for holding mistaken views, and encourage people to think of themselves as becoming more right, rather than less wrong :-)
Another thing I found helpful was to provide new aspiring rationalists with a sense of community and social belonging. Joining a community of aspiring rationalists who are sensitive to a newcomer's emotions, and who help that newcomer deal with the challenges they experience, is invaluable for overcoming the emotional strains of the Valley. Especially useful is having people trained in coaching or counseling serve as mentors for new members, guiding their intellectual and emotional development alike. I'd suggest that every LW meetup group consider instituting a system of mentors who can provide both emotional and intellectual support for new members.
Now I'd like to hear about your experiences traveling the Valley, and what tools you and others you know used to manage it. Also, what are your ideas about useful tools for that purpose in general? I look forward to hearing your thoughts!
A few times I got a reaction like: "I don't want to hear your facts!" which I translated as: "If there is a part of reality that doesn't match my map, I don't want to know about that part."
The part "your facts" is already weird. As if saying that different people live in different realities, and I don't want my reality to become contaminated by your reality (which could happen if I start to observe your reality too close or under your guidance). But of course we are talking about maps here. So basicly "your facts" means: "There is only my map and your map, and I am not interested in your map." So it's not like I don't want my map to correspond to the territory, but rather like there is no territory that could judge my map and find it wanting. There are only maps, and of course your map is going to differ from my map, but if you insist on me looking at your map, that is merely an aggression, a status move.
(I can even see how our educational system contributes to this feeling that it's maps all the way down. Most of what happens in schools is students copying the teachers' maps. But I digress.)
EDIT: Another example, maybe a better one. There are people who love to share "their opinions" on the theory of relativity, quantum physics, evolution, whatever. But if you suggest that they read a textbook, or a popular science book on the topic, to fix at least their most obvious misconceptions, they proudly refuse. They prefer their original bullshit interpretation, even when there is an option to fix the obvious mistakes and make their bullshit more credible (which IMHO should be preferable even to people who like their own bullshit theories).
I think that's more a case of people becoming jaded from constantly being presented with "facts" that are false or at least highly misleading, backed by arguments too clever for them to refute.