Epistemic status: Exploratory and in testing

I'm reasonably confident there are many smart, curious people who would like to learn rationality: how to think better and bring the contents of their minds into closer correspondence with reality.

Framing rationalist outreach as establishing local branches of the LW community (Rationalist Clubs, Effective Altruist meetups, etc.) may grow the community to some extent, but anyone who doesn't already think of themselves as a rationalist will come only from whatever tribe the local branch mainly consists of, whether that's weird engineers, animal-welfare vegans, crypto nuts, or secular Buddhists.

And then of course there are the cases where people already assume that Effective Altruism is the thing Sam Bankman-Fried pretended to practice before he stole all that money, and that LessWrong is the place where they talk about how AI will become an Evil Vaguely Judeo-Christian God who Tortures Us in the Future.

However, I am moving toward the conclusion that, detached from the tribal baggage, most general-purpose debiasing techniques and utilitarian reasoning are not inherently difficult to teach to smart, curious, motivated people, even those from non-LW-median tribes. It is not outrageously Deep Magic to point out that students often learn arbitrary parroting instead of knowledge, and continue to think that way after they graduate, or that people use their moral philosophies to feel agreement and affiliation with their tribes, and then to go from there.

An old success case I found on my first search. I have also had a pretty decent success rate with leftist-tribe friends and acquaintances, and I plan to continue testing. Of course, you first need an established norm of having genuine, abstract conversations, but that is fun and useful to establish anyway.

(And any large-scale societal rise in the sanity waterline will presumably involve normalizing these concepts outside the community, not expanding the community to that scale, so it's a good time to start.)

TAG (4mo):

Any interest in rationalist learning?

I agree that many of us outsiders would like to understand and utilise rationalist thinking. I had not, for example, encountered the 'rationalist' take that 'AI will become Evil Vaguely Judeo-Christian God who Tortures Us in the Future'!