Related to: Lessons from Latter-day Saints, Building Rationalist Communities overview

This is my basic thesis:

Marx needed a Lenin. Fermi, Hahn and Meitner needed a Manhattan Project. EY and the Sequences need more clearly- and simply-defined rationality skills and methods for improving them.

Using Eliezer’s levels scheme, these are the three descending levels on which belief systems operate: theology, norms, and implementation.

I’ll give some examples. Here’s a general example, again from the Latter-day Saints:

  • Theology: God knows everything. Your purpose on earth is to become like God.
  • Norms: You should pursue as much education as possible.
  • Implementation: Create and operate a really big, cheap university system.[1]

Here’s one that I often dealt with as a missionary:

  • Theology: God is really good at making decisions. Your purpose on earth is to become like God.
  • Norms: You shouldn’t consume alcohol, tobacco, tea, coffee, or other addictive substances. Addictive substances impair your ability to make correct decisions.
  • Implementation: We are going to bring you candy every week so that when you’re tempted to buy a cigarette, you can eat the strawberry toffee instead. (Or, we are going to stop by your house every day at 8:30pm to give you a boost, because going from 7 cups of coffee a day to 0 is tough.)

I did both of these (with different people), and they worked.

Norms and Implementation

As a missionary for the Church, my basic role was to:

  • find people who were willing to try something out,
  • design individualized “commitment systems” for each person, and
  • support them in implementing them.

What we were really offering was a lifestyle change.

The “basic package” (my terminology), which is a prerequisite to joining the church, includes: a strong focus on strengthening the family, daily family prayer and scripture study, the aforementioned health code, and sex only inside marriage. The glue is weekly church attendance, which ensures membership in a community that shares the same values.

After the “basic package,” it gets a bit more complex, as there are lots of higher-level elements of this lifestyle. To sample a few in no particular order:

  • Loving others. Developing gratitude. Keeping a journal. Following Church leaders. Inviting other people to church. Serving others, especially by accepting responsibilities in church. Pursuing education. Forgiving others.
  • Understanding that you have innate self-worth. Not gossiping. Dressing modestly. Being a good parent. Honoring and respecting parents. Keeping a budget. Doing family activities and not shopping on Sunday. Staying out of debt.
  • For the doubting: continuing to live these habits until, as expected, greater belief develops through experience.[2]

Obviously these are different from rationalist norms, but my point is that they are fairly comprehensive. Though each topic is regularly discussed in church, it’s impossible to implement them all into your life at once. It’s easy to feel overwhelmed by the flood of new information. (Sound familiar?)

And that is why we were there: to design mini-programs for each person.[3] We would isolate a couple of specific standards that would be effective for person X, and assist in implementing them. If they liked it and wanted more, we helped them implement the “basic package” lifestyle.
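As a toy illustration of the selection rule (footnote [3] spells it out), here is the benefit-versus-effort ranking as code. The practice names and numbers are invented; nothing this formal happened in the field.

```python
# A toy version of the cost-benefit standard from footnote [3]: rank candidate
# practices for one person by estimated benefit per unit of effort, and start
# with the top one or two. All names and numbers below are invented.

def prioritize(practices):
    """Sort (name, benefit, effort) tuples by benefit-to-effort ratio, best first."""
    return sorted(practices, key=lambda p: p[1] / p[2], reverse=True)

candidates = [
    ("daily journal", 3.0, 1.0),   # (name, est. benefit, est. effort)
    ("keep a budget", 5.0, 2.0),
    ("quit coffee",   8.0, 8.0),
]

for name, benefit, effort in prioritize(candidates):
    print(f"{name}: {benefit / effort:.1f} benefit per unit of effort")
```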

This decision, to like it and want more, was the single most crucial one someone could make. It is directly related to Bhagwat’s Law of Commitment: “The degree to which people identify with your group is directly proportional to the amount of stuff you tell them to do that works.” I will discuss this further in a subsequent post.

Okay, so how does this apply to Less Wrongians?

Less Wrong has its version of a theological framework – the Sequences. They give a comprehensive set of statements about the way the world works, drawn from evolutionary psychology, anthropology, Bayesian statistics, etc.

But rationalism doesn’t have a well-defined set of norms/desirable skills to develop. As a result, we Less Wrongians unsurprisingly also lack a well-developed practical system for implementation.

You may cite lukeprog’s guide. That’s good, but it’s only six posts. Less Wrong needs a lot more of it!

Or maybe you’ll say that if you read the Sequences carefully, etc., etc. Well, I did. Mysterious Answers to Mysterious Questions is 51 dense pages in Word, or about 25,000 words. It is an (extremely good) foundational text. It is not a how-to manual.[4]

Brevity is key to implementation.

For Latter-day Saints, the basic explanation of family standards is about 6,000 words (95% of the important material is on pages 4 through 15). The basic guide for teenagers is about 4,000 words, and the basic guide for running a church organization is about 12,000 words. Each one is very clear about what to do. (The teenage guide illustrates the point about brevity most clearly.)

The easiest way to begin building a how-to manual is for LW members to post specific, short personal examples of how they applied the principles of rationality in their day-to-day lives. Then they should collect all of the links somewhere, probably on the wiki.

If this sounds salesman-y or cheesy to you, or if you're extremely skeptical about religion, I'll quote a commenter on my last post. “If this works for people that are obviously crazy,” said Vaniver, “that suggests it'll work for people who are (hopefully obviously) sane.”


[1] Admittedly, this also supports other norms, such as ‘marry another Latter-day Saint.’

[2] I’m not claiming this is perfect. Over the four years since I joined, I've encountered varying amounts of ingroup snobbery, use of these standards to judge others, cliquishness, and intolerance toward certain groups, primarily gays. Plus all of the normal human imperfections.

[3] In designing and sequencing programs, we generally used a simple cost-benefit standard: how much will this help X vs. how much effort will it cost X?

[4] By comparison: the Bible is a foundational text of Christianity. The Purpose-Driven Life is a derivative how-to manual. This is a distillation of the Sequences, which is at least a start.

Comments

Reading this, I wonder why an LDS missionary got interested in a rationalist community which is generally hostile to religion. I would appreciate some explanation about the author's motivations. Strictly speaking, this is irrelevant to the message, but being confused about one's aims somewhat lowers my trust in one's suggestions. This is not to say that the suggestions themselves are suspicious or clearly wrong - on the contrary, they are written in an impartial style that very well fits into LW customs, which makes me even more curious about the author's background.

On a slightly different note, there is only so much one can improve on the organisational level, and we should keep in mind what expectations we have of this community. In a sense, I second Vladimir Nesov's and cousin_it's comments. Community building is instrumentally important, but really shouldn't become a major terminal value if we want to maintain a high level of rationality. Aspiring rationalists are in danger of wandering in a strange circle: at first, they crave getting rid of common biases and developing abilities for efficient truth seeking. But then, these very abilities lead them to discover th…

Reading this, I wonder why an LDS missionary got interested in a rationalist community which is generally hostile to religion. I would appreciate some explanation about the author's motivations.

Because there are things I can learn here. I can handle the hostility to religion. But if you don't cross-pollinate, you become a hick.

It doesn't seem to me to be possible to hold both rationality and religion in one's head at the same time without compartmentalization, which is one of the things rationality seeks to destroy.

I can actually quite easily accept that it could be a good idea for rationalists to adopt some of the community-building practices of religious groups, but I also think that rationality is necessarily corrosive to religion.

If you've squared that circle, I'd be interested to hear how. Being somewhat religious for the social bit but having excised the supernaturalism is the only stable state I can think of.

compartmentalization, which is one of the things rationality seeks to destroy

Yes, in the ideal limit, rationalists don't compartmentalize. But decompartmentalizing too early, before you learn the skill of sanely resolving conflicts between beliefs and values in different compartments (rather than, for example, going with the one you feel more strongly), is a way to find the deepest, darkest crevices in the Valley of Rationality.

I also think that rationality is necessarily corrosive to religion.

I would agree that there is a level of rationality at which religious beliefs become impossible, though to the extent that a religious person takes their religion seriously, and expects effective rationality to produce accurate beliefs, they should not expect this until it actually happens to them. Though it does occur to me that helping rationalists to establish the level of social support available within the Mormon community (as calcsam is doing) is an effective way of establishing a line of retreat for a religious person who is uncertain about retaining their religious beliefs.

handoflixue:
Simple question, but what exactly is meant by "religion" when you say "there is a level of rationality at which religious beliefs become impossible"? I've been wondering about this for a while, and find it unclear whether my spiritual side is simply "not actually religion" or if there's just some huge chunk of rationality that I'm missing. Thus far, the two have felt entirely compatible for me.
Perplexed:
I wonder if you could clarify for me what you mean by "spiritual" in "spiritual side"? I was raised as a Roman Catholic, and to me 'spiritual' means the other side of Descartes's dualism - the non-physical side. So, for example, I learned that the Deity and angels are purely spiritual. But being human, my spiritual side is my immortal soul - which pretty much includes my mind. I'm pretty sure you (and millions of other people who talk about spirituality) mean something different from this, but I have never been able to figure out what you all mean. A definition of 'spiritual' is preferred, but failing that, could you taboo 'spiritual' and say what you meant by 'spiritual side' without using the word?
handoflixue:
More or less, it's schizophrenic/delusional episodes, with an awareness that this is in fact what they are. Mostly what I use 'spiritual' to refer to is that, during these episodes, I tend to pick up a strong sense of 'purpose' - high level goals end up developed. I have no clue how I develop these top-level goals, and I've never found a way to do it via rationality. Rationality can help me mediate conflicts between goals, conflicts between goals and reality, and help me achieve goals, but it doesn't seem able to set those top-level priorities. About the closest I've come to doing it rationally is to realise that I'm craving purpose, and do various activities that tend to induce this state. Guided meditation is ideal, since it seems to produce more 'productive' episodes.

It varies heavily whether I will get any particularly useful purpose out of one of these episodes; many episodes are drifting and purposeless, and others result in either impossible goals or 'applause light' goals that have no actual substance attached. Ostensibly I could try to infer my goals from my emotional preferences, which I've been slowly working on as an alternative. Being bi-polar and having a number of other neurological instabilities makes it very difficult to get any sort of coherent mapping there, beyond very basic elements like 'will to live'. Even those basics can be unstable: for about a year I had no real preference on my own survival due to a particularly bad schizophrenic episode. I'd actually be rather curious how others handle the formation of top-level goals :)

I do also notice certain skills that I'm much more adept at when I'm having such an episode. I've observed this empirically, and can come up with rational explanations for it. I'm pretty certain the same results could be replicated rationally, either by studying the skills or by figuring out what I'm doing different during the schizophrenic episodes. I don't feel that 'spiri
Swimmer963 (Miranda Dixon-Luinenburg):
I find I have very little emotion attached to my highest-level goals. I'm not sure but I think I derive them by abstracting from my lower-level goals, which are based more on habit and emotion, and from ideas I absorb from books, etc. I then use them to try and make my lower-level goals less contradictory.
Sniffnoy:
Yeah, this does not seem to have much to do with what we are usually talking about when discussing religion, supernaturalism, etc.
timtyler:
FWIW, I typically use the term in a secular sense to refer to those with interests in items from this list: meditation, religious experiences, drugs, altered states, yoga, chanting, buddhism, taoism, other eastern mysticism, martial arts and self-improvement.
thomblake:
One reason amongst many: inasmuch as your religion includes unquestionable dogma, it is anathema to rationality. (It is for this reason that, being a philosopher, I am non-religious for methodological reasons; dead dogma is not allowed.) Having a belief that you cannot question is effectively giving it a probability of 1, which will distort the rest of your Bayesian network in terrible ways. See Infinite Certainty.
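(A quick numerical sketch of that probability-1 point, added as illustration; the numbers are arbitrary. Under Bayes' theorem, a prior of exactly 1 is unmoved by any evidence at all.)

```python
# Sketch: a prior of exactly 1 never updates, no matter how strong the
# counter-evidence. Numbers here are arbitrary.

def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """P(H | E) via Bayes' theorem."""
    joint_true = prior * p_evidence_if_true
    joint_false = (1 - prior) * p_evidence_if_false
    return joint_true / (joint_true + joint_false)

# Evidence that is 999x more likely if H is false:
print(posterior(0.9, 0.001, 0.999))  # ~0.009 -- a 0.9 prior collapses
print(posterior(1.0, 0.001, 0.999))  # 1.0 -- infinite certainty is immovable
```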
handoflixue:
Having been raised Unitarian Universalist, I always find it very odd that "religion" is conflated with "unquestionable dogma". I don't think Unitarians have that any more than LessWrong does. That said, if "religion" is being used as a shorthand for "unquestionable dogma", then the comments about religion make significantly more sense :)
Vladimir_M:
I highly doubt that. For one, a glance at a typical Unitarian web page will show a comprehensive and consistent list of left-wing ideological positions. Are you really claiming that if one were to express deep disagreement with those among the Unitarians, the reactions would be equally dispassionate, upfront, and open to argument as they usually are when the prevailing opinion is challenged on LW? (Not that LW is perfect in this regard either, but compared to nearly any other place, credit must be given where it's due.) Of course, some would claim that old-fashioned religious dogma is somehow incomparably worse and more irrational than modern ideological dogma, so much that the UU stuff doesn't even deserve that designation. However, I don't think this position is defensible, unless we insist on a rather tortured definition of "dogma."
handoflixue:
The more I think about it, the more I find it difficult to answer this question. The main obstacle I'm running up against is that the two have very different communication styles, so the answer varies heavily depending on which communication style you're seeking.

In my experience, LessWrong is a very blunt, geeky approach to communication. It is also post-based, and thus neither real-time nor face-to-face. It's very good at problem solving and science. People are likely to try and refute my stance, or treat it as a factual matter to be empirically tested.

Unitarian Universalist churches, by contrast, have been very polite and mainstream in their approach to communication. It's also in-person, real-time interaction. They're very good at making people feel welcome and accepted. People are likely to simply accept that I happen to believe differently than them. People are likely to treat strong assertions as an article of faith, and therefore not particularly worth challenging.

I can't really find a way to translate between these two, so I can't really compare them. Viewed through a mainstream, polite filter, I see LessWrong as a place that is actively hateful of religion, and extremely intolerant towards it, to the point of being willing to reject perfectly useful ideas simply because they happen to come from a religious organization. Viewed through the blunt, geeky filter, I see UUs as blindly accepting and unwilling to actually challenge and dig in to an idea; I feel like I can have a very interesting discussion, but in many respects I'm a lot less likely to change someone's mind (although, in other respects, I'd have a lot more luck using Dark Arts to manipulate a church-goer)
handoflixue:
Well, there are seven formal UU values:

  • the inherent worth and dignity of every person;
  • justice, equity and compassion in human relations;
  • world peace, liberty and justice for all;
  • respect for the interdependent web of all existence;
  • acceptance of one another and encouragement to spiritual growth in our congregations;
  • a free and responsible search for truth and meaning; and
  • the right of conscience and the use of the democratic process within our congregation and in society at large.

I would consider the first four to be values that are roughly shared with LessWrong, although there are definitely some differences in perspective. The fifth one: UUs focus on spiritual growth, LW focuses on growing rationality. The sixth principle is again shared. The seventh seems implemented in the LessWrong karma system, and I'd actually say LW does better here than the UUs.

It's also worth noting that these are explicitly "shared values", and not a creed. The general attitude I have seen is that one should show respect and tolerance even to people who don't share these values. LessWrong is a place for rationalists to meet and discuss rationality. UU Churches are a place for UUs to meet and discuss their shared values. It doesn't serve LessWrong to have it dominated by "religion vs rationality" posts, nor posts trying to sell Christianity or de-convert rationalists. It doesn't serve the UUs to have church dominated by challenges to those values.

Well, there are seven formal UU values:

This is a list of applause lights, not a statement of concrete values, beliefs, and goals. To find out the real UU values, beliefs, and goals, one must ask what exact arrangements constitute "liberty," "justice," etc., and what exact practical actions will, according to them, further these goals in practice. On these questions, there is nothing like consensus on LW, whereas judging by the uniformity of ideological positions espoused on the Unitarian/UU websites, there does seem to be a strong and apparently unchallenged consensus among them.

(To be precise, the applause lights list does include a few not completely vague goals, like e.g. "world peace," but again, this says next to nothing without a clear position on what is likely to advance peace in practice and what to do when trade-offs are involved. There also seems to be one concrete political position on the list, namely democratism. However, judging by the responses seen when democracy is questioned on LW, there doesn't seem to be a LW consensus on that either, and at any rate, even the notion of "democracy" is rather vague and weasely. I'm sure that the UU folks would be horrified by many things that have, or have historically had, firm democratic support in various places.)

handoflixue:
The core theme I've seen repeated across congregations is the "seven core principles" that I posted above. I've seen some degree of ideological consistency across those, but I've attended quite a few sermons discussing various perspectives on the seven core principles. It seems like a fairly common tradition to even invite speakers from other religions or affiliations to come and share their own thoughts. Certainly there is a bias towards those who are "compatible" with the group consensus, and some degree of "group think". LessWrong has this going for it as well, though: there's a strong thread of anti-religion bias, and I'd say there's a moderate pro-cryonics/singularity bias. I don't see a lot of posts about how SIAI is a waste of time and money, or how Christianity is really misunderstood and we should come to embrace our Lord and Saviour, Jesus Christ. Can you point to something specific in the UU literature that makes you feel that they're less tolerant of dissent than LessWrong?

Can you point to something specific in the UU literature that makes you feel that they're less tolerant of dissent than LessWrong?

Before I even click at a link to a Unitarian Universalist website, I know with very high probability that there is going to be a "social justice" section espousing ideological positions on a number of issues. And for any such section, I can predict with almost full certainty what precisely these positions will be before I even read any of it.

Now, the UU folks would probably claim that such agreement exists simply because these positions are correct. However, even if I agreed that all these positions are correct, given the public controversy over many of these issues, it would still seem highly implausible that such ideological uniformity could be maintained in practice in a group highly tolerant of dissent. In contrast, I see nothing comparable on LW.

You say:

LessWrong has this going for it as well, though: there's a strong thread of anti-religion bias, and I'd say there's a moderate pro-cryonics/singularity bias. I don't see a lot of posts about how SIAI is a waste of time and money, or how Christianity is really misunderstood and we

…
handoflixue:
I suppose I should reiterate this, as it seems to be unclear: My point was not that UUs don't have a degree of "group consensus." My point was that they do not treat it as an unquestionable dogma. That they generally have a "social values" page does not seem at all contradictory to this - the issue is whether they're willing to entertain discussion from opposing views. In my (anecdotal) experience as someone who has actually attended UU churches, the answer has been very strongly yes. If you have actual experiences to the contrary, or have seen websites from them that seem to make it vividly clear that dissent is not tolerated, I'd be genuinely curious to see this. It's entirely possible that my experiences aren't typical, but I haven't seen any evidence to support that theory.
handoflixue:
Tangentially: The discussion of actual issues and biases on LessWrong is appreciated. I've only been here briefly, so I haven't really gotten to know the community that well yet. This was sadly not clear in my original post, but my goal was to compare "looking at a public website" to "reading top-level posts". I've never seen a top-level post supporting Christianity or condemning the SIAI here. On an individual level, I'm sure there are people that hold those stances, just as there are individual UU members who don't agree with the values you're seeing on the UU websites. My point was simply "when you look at the 'public face' of an organisation, you're going to see some degree of consensus, because that's just how human organisations work"
Dreaded_Anomaly:
LessWrong FAQ: You don't see a lot of posts about how gravity doesn't really exist and it's just the Flying Spaghetti Monster pushing us down with his tentacles, either. Note the previous part of the sentence by Vladimir_M that you quoted: (emphasis added) There's a difference between consensus on empirical questions where the evidence falls overwhelmingly on one side, and consensus on higher-level ideological questions with a much less clear distribution of both evidence and arguments.
wedrifid:
Nevertheless there are some from time to time, as well as comments to that effect and many more that are ambivalent.
AdeleneDawner:
For clarity: How do you think the members of your local UU congregation would react if one of their members turned up one day and said something along the lines of "you know, I've been thinking about it and doing the math, and it looks to me like war is actually pretty useful, instrumentally - it seems like it saves more lives than it takes, and at least in places with recruitment methods like ours, people who choose to be soldiers seem to get a fairly good deal out of it on average"?
handoflixue:
I've been to sermons on exactly that topic, so I'd have to argue that in my experience they take it very well.
Peterdjones:
Dictatorship of the Proletariat? Class struggle? Ownership of the means of production? Universal free healthcare, even? Or did you mean the kind of policies that count as "left wing" in the US, and liberal/moderate/centre-left everywhere else?

Or did you mean the kind of policies that count as "left wing" in the US, and liberal/moderate/centre-left everywhere else?

"Everywhere else"? I hate to break the news, but there are other places under the Sun besides the Anglosphere and Western Europe! In most of the world, both by population and surface area, and including some quite prosperous and civilized places, many UU positions would be seen as unimaginably extremist. (Try arguing their favored immigration policies to the Japanese, for example.)

You are however correct that in other Western/Anglospheric countries, the level of ideological uniformity in the political mainstream is far higher than in the U.S., and their mainstream is roughly similar to the UU doctrine on many issues, though not all. (Among their intellectual elites, on the other hand, Unitarian Universalism might as well be the established religion.)

In any case, I didn't say that the UUs had the most extreme left-wing positions on everything. On the contrary, what they espouse is roughly somewhere on the left fringe of the mainstream, and more radical leftist positions are certainly conceivable (and held by some small numbers of people). What is significant for the purposes of this discussion is the apparent ideological uniformity, not the content of their doctrine. My points would hold even if their positions were anywhere to the left or right of the present ones, as long as they were equally uniform.

Emile:
There are some conservative Unitarian Universalists, which seems to indicate that there isn't complete ideological uniformity.
Vladimir_M:
Point taken, and thanks for the interesting link. Googling around a bit more, it seems like there are a few groups like these, but they are small and extreme outliers without influence and status. Before writing my above comments, I checked out the links on the first few search pages that come up when you google "Unitarian Universalist," and I definitely encountered perfectly predictable and uniform positions advocated on those.
Emile:
In case you haven't encountered him before, Peter A Taylor, the author of that FAQ has some interesting articles on religion and politics: Rational Religion, The Market for Sanctimony, or Yet Another Space Alien Cult, What Does "Morality" Mean?, etc. - he apparently is a reader of LessWrong.
Vladimir_M:
Yes, I have rummaged around his website already. There is some interesting stuff there. Interestingly, in the "Market for Sanctimony" article, he confirms my impressions about Unitarian Universalism, contrary to the claims of User:handoflixue:
handoflixue:
My claim was about unquestionable dogma, and the UUs as a whole. I'm not sure how we can still be having this debate after someone else provided you links to UUs who question the dogma...
Swimmer963 (Miranda Dixon-Luinenburg):
I was raised a Unitarian Universalist too, by agnostic parents. It probably has a lot to do with my generally positive attitude towards religion. (I now sing in a High Anglican church choir and attend services regularly mostly because I find it benefits my mental health.)
thomblake:
Given that Unitarianism was originally Christian and yet some UUs have collectively embraced atheism, you are probably right about that.
Peterdjones:
i.e. they won't burn you at the stake, and they won't stick around to be questioned when they can find someone else to talk to who'll agree with them.
handoflixue:
I'm curious if you're being sarcastic or serious. It's hard to tell online :)
calcsam:
I'd be happy to answer that. But for purposes of keeping the thread more on the community organization topic, I wanted to channel discussion of my religious beliefs over on this discussion thread. Would you like to repost your comment over there?
Gray:
Hmm? Thomas Bayes was a Presbyterian minister, C. S. Peirce was Catholic and Newton was an unorthodox Christian described as "highly religious". I'd be more interested in seeing a list of esteemed rationalists who were not religious compared to such a list that were religious. In any case, it is pretty clear that it is possible to hold rationality and religion in your head at the same time. This is basically how most people operate.
Nornagest:
While I think there exists a level at which mainstream religious faith is inimical to epistemic rationality, I also think it's most likely a pretty advanced level, higher than most if not all of the regulars here have attained. (Note however that people can and do give up religion on grounds of rationality before hitting that level.) It's certainly possible to make substantial contributions to the advancement of human rationality in its present state while also being a theist, and that was still truer a few hundred years ago when the foundations of the art were being laid. That being said, there's also a distinction to be made between esteemed rationalists and esteemed scientists or mathematicians whose work contributed indirectly to LW-method rationality. Of the people that worked on the early foundations of statistics, Laplace is the only one I can think of offhand that strikes me as having had strong public commitments to rationality in this site's usual sense.
XFrequentist:
People who solved math problems useful for rationality but espoused false beliefs would not qualify as "esteemed rationalists" in my book. (Robert Aumann belongs on this list, by the way.)
wilkox:
More generally, "In any case, it is pretty clear that it is possible to hold rationality and irrationality in your head at the same time. This is basically how most people operate." I'm no more surprised to hear about a religious rationalist than I am when I notice yet another of my own irrational beliefs or practices.
Bongo:
He must be talking about LW-style rationality or X-rationality as distinguished from traditional rationality. And learning about X-rationality has been known to deconvert people on whom the traditional-rationality-based arguments of Dawkins and skeptics didn't work. And then there are additional arguments for why X-rationality is the real thing and deserves to be called just rationality.
XFrequentist:
Yes.
Emile:
Nick Szabo has an interpretation of religious traditions that makes sense, though I'm not sure I fully agree. (Edit) This essay: Is Rational Religion possible? might be of interest too. It seems that there are significant numbers of Jews, Anglicans and Shintoists that don't believe in the theology and the supernatural stuff, but still identify as members of the religion and follow the traditions (I don't know if there are any of those among Mormons, though after hearing calcsam I'd increase my expectation).
David_Gerard:
Culture is pretty strong. My girlfriend oscillates between Christianity and Paganism and is an active member of the local Church of England. They're currently trying to draft her as a volunteer for all sorts of things (the standard punishment for public display of sanity and competence). I'm a sceptical atheist and I'm on the fringes of but still pretty much somewhat part of said church community. Mind you, the C of E is hardly that unfriendly to atheists ... and Richard Dawkins still visits his and neither it nor he catch fire as he walks in ... Or: Yes, religion is supposedly about the silly ideas and mad old books, but only works if it's a community, and these will often include people who expressly repudiate the silly ideas and mad old books. "If the Church of England relied on Christians, it'd be sharing a room with the Flat Earth Society" - Shelley (TV show), quoted from memory.
[anonymous]:
I can think of at least two other stable states - in one, you've had an experience that has acted as strong Bayesian evidence for you of the existence of $DEITY, but which is either a purely subjective experience or which is non-repeatable. As an example of this class of event, if I were to pray "Oh Lord, give me enough money to never have to work again" and then two hundred thousand people were to buy copies of my books in the next five years, that would be enough evidence that it would be rational for me to believe in God.

Another stable state might be someone who has been convinced by Frank Tipler's Omega Point hypothesis. Tipler himself is now clearly extremely irrational, but the hypothesis itself is taken seriously enough by people like David Deutsch (who is one of the less obviously-egregiously-stupid public intellectuals) that it's not obviously dismissable out-of-hand. I'm sure there are others, too.

EDIT - when I said "in the next five years" I meant to type "the next five minutes", which would of course be much stronger evidence.
Desrtopa:
Do you really think that would be enough? Even if you don't think that the God hypothesis has a truly massive prior probability to overcome, you'd still have to reconcile this with the fact that most prayers for improbable things go unanswered, to the point that nobody has ever provided a convincing statistical demonstration that it has any effect except on people who know that prayers have been made. Taking this as sufficient Bayesian evidence to hold a belief in God seems like believing that a die is weighted because your roll came up a six, when you know that it's produced an even distribution of numbers in all its rolls together.
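(A minimal sketch of that die analogy in code, added as illustration with made-up numbers: a single six is mild evidence for a weighted die, but a long, even record of rolls overwhelms it.)

```python
# Toy model: a "weighted" die rolls six with probability 0.5, a fair die with 1/6.
# Compare the posterior after one lucky six vs. after a long, even record.
from math import prod  # Python 3.8+

def p_weighted(rolls, p_six=0.5, prior=0.5):
    """P(weighted | rolls) with a uniform prior over {weighted, fair}."""
    p_other = (1 - p_six) / 5  # non-six faces under the weighted hypothesis
    lik_weighted = prod(p_six if r == 6 else p_other for r in rolls)
    lik_fair = (1 / 6) ** len(rolls)
    return prior * lik_weighted / (prior * lik_weighted + (1 - prior) * lik_fair)

print(p_weighted([6]))                      # 0.75: one six is some evidence
print(p_weighted([1, 2, 3, 4, 5, 6] * 10))  # ~5e-07: the even record swamps it
```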
Oscar_Cunningham:
The reason why rationality destroys religion is precisely because there is no evidence of this kind. It's not a priori impossible to hold rationality and religion decompartmentalised in one's head, but it is impossible in this universe.
endoself:
Even in that case, a very powerful being messing with you is more likely than an uncaused, ontologically fundamental very powerful being (and not just because of the conjunction - a caused, reducible very powerful being is far more likely). Or did you just mean that this point was less obvious, so it would be harder for someone to realize that they were wrong?
Swimmer963 (Miranda Dixon-Luinenburg):
Was he more rational before? (I did read two of his books a while back, and I remember being very excited beforehand and very disappointed afterwards, but I can't remember enough specifics to say why.)
[anonymous]:
I believe so. His career path seems to go:

  • 70s - studies with John Wheeler, makes some small but clever contributions to cosmology and relativistic physics.
  • 80s - co-writes the widely praised book The Anthropic Cosmological Principle with John Barrow, first suggests the Omega Point hypothesis.
  • 90s - writes The Physics Of Immortality, laying out the Omega Point hypothesis in much more detail and explicitly identifying the Omega Point with God. People think this is clever but going a little far. Tipler's contract for a textbook on gravitation gets cancelled and the university at which he has tenure stops giving him pay-rises.
  • 2000s - writes The Physics Of Christianity, in which he suggests cloning Jesus from the Turin Shroud so we can learn how he annihilated baryons, becomes a referee for creationist journals and an occasional right-wing commentator, and argues that Barack Obama is evil because the luminiferous aether is real and because of a bit of the film Starship Troopers.
JoshuaZ:
The criticism of Obama was slightly more coherent than that. The Tribe paper in question really was an example of the common attempt for people to take ideas in math and physics and try to apply them as strong metaphors in other areas in ways that are really unhelpful and at best silly. In that regard, most of Tipler's criticism was straight on.
[anonymous]:
Yeah, except for two facts: Obama had no actual input into Tribe's paper. Tipler's physics in his paper is even less coherent than Tribe's.
Swimmer963 (Miranda Dixon-Luinenburg):
Ok that's really...random. (Overused and underdefined word but that was the response my brain gave me).
TimFreeman:
The Tipler/Obama/aether connection seemed bizarre enough that I looked it up: http://pajamasmedia.com/blog/obama-vs-einstein/ Some quotes:

  • Einstein’s general relativity is just a special case of Newtonian gravity theory incorporating the ether
  • Hamilton-Jacobi theory is deterministic, hence quantum mechanics is equally deterministic
  • There was absolutely nothing revolutionary about twentieth century physics.

I agree on the "random" part.
[anonymous]:
That's nothing. Read the full paper - http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1271310 . Forty-five pages of the most gloriously wrong thinking you'll ever come across in your life. But then he'll come out with a piece of utterly lucid reasoning on applying Bayes' theorem to the Born probabilities like http://arxiv.org/abs/quant-ph/0611245 . Very, very strange man.
gwern:
I think this is a relevant rationality quote: http://lesswrong.com/lw/2ev/rationality_quotes_july_2010/28nw
[anonymous]:
Wow. While I'm unsurprised that Tipler would take issue with yet another poetical injection of something that superficially looks like quantum physics into yet another unrelated subject area, I'm more surprised that he'd express it in such a bizarre manner. There's a whole paragraph where he name-drops his academic genealogy. And then he acts like Obama is making these claims, when at best he contributed "analytic and research assistance", whatever that means.

I read The Physics of Immortality as an undergrad in '04 and was skeptical of his major claims. I'm disappointed by his downward spiral into crackpot territory.
handoflixue:
Speaking solely for myself, I've found that my spiritual / religious side helps me to set goals and to communicate with my intuitions. Rationality is simply a tool for implementing those goals, and processing/evaluating that intuitive data. I've honestly found the hostility towards "spirituality writ large" here rather confusing, as the majority of the arguments seem to focus on a fairly narrow subset of religious beliefs, primarily Christian. I tend to write it off as a rather understandable bias caused by generalizing from "mainstream Christianity", though, so it doesn't really bother me. When people present actual arguments, I do try and listen in case I've missed something. Or, put another way: Rationality is for falsifiable aspects of my life, and spirituality is for the non-falsifiable aspects of my life. I can't have "incorrect" goals or emotions, but I can certainly fail to handle them effectively.

I can certainly fail to handle them effectively.

If 'spirituality' helps you to handle these things effectively, that is empirically testable. It is not part of the 'non-falsifiable' stuff. In fact, whatever you find useful about 'spirituality' is necessarily empirical in nature and thus subject to the same rules as everything else.

Most of the distaste for 'spirituality' here comes from a lack of belief in spirits, for which good arguments can be provided if you don't have one handy. If your 'spirituality' has nothing to do with spirits, it should probably be called something else.

handoflixue:
Hmmmmm, I'd never considered the idea of trying to falsify my goals and emotions before. Now that the idea has been presented, I'm seeing how I can further integrate my magical and rational thinking, and move to a significantly more effective and rational standpoint. Thank you!
thomblake:
Glad to be of help :)
JohnH:
There are stats on the effects of religion on a population that practices said religion. This should give some indication of the usefulness of any spirituality.

I can't have "incorrect" goals or emotions

You can have goals that presuppose false beliefs. If I want to get to Heaven, and in fact there is no such place, my goal of getting to Heaven at least closely resembles an "incorrect goal".

This raises an interesting question -- if a Friendly AI or altruistic human wants to help me, and I want to go to Heaven, and the helper does not believe in Heaven, what should it do? So far as I can tell, it should help me get what I would want if I had what the helper considers to be true beliefs.

In a more mundane context, if I want to go north to get groceries, and the only grocery store is to the south, you aren't helping me by driving me north. If getting groceries is a concern that overrides others, and you can't communicate with me, you should drive me south to the grocery store even if I claim to want to go north. (If we can exchange evidence about the location of the grocery store, or if I value having true knowledge of what you find if you drive north, things are more complicated, but let's assume for the purposes of argument that neither of those hold.)

This leads to the practical experiment of asking religious peop…

Eliezer Yudkowsky:
Erm, that's supposing the religious person would actually want to suicide or do the ridiculous thing, rather than this itself being an expression of belief, affirmation, and argument of the religion. (I.e., as appeal to consequences, or saying negative things about the negation.)
TimFreeman:
The most reasonable interpretation I can find for your statement is that you're responding to this: I agree, the goal would be to figure out what they would want if their beliefs were revised, and revising their circumstances so that God puts Himself out of the picture isn't quite the same as that. The experiment also has other weaknesses:

  • Ebay bidding shows that many people can't correctly answer hypothetical questions. Perhaps people will accidentally give false information when I ask.
  • The question is obviously connected with a project related to atheism. Perhaps some religious people will give false answers deliberately because they don't want projects related to atheism to succeed.
  • The relevant question is what the FAI thinks they would want if there were no God, not what they think they would want. A decent FAI would be able to do evolutionary psychology and many people can't, especially religious people who don't think evolution happened.
  • It's not a real experiment. I'm not systematically finding these people, I'm just occasionally asking religious people what they think. There could easily be a selection effect since I'm not asking this question of random religious people.
XFrequentist:
We are at high risk of arguing about words, and I don't wish to do that. Describe specifically what you do when you're using your spiritual side. Assign it a label other than "spirituality" or "religious". Then I can give you my opinion. As stated your comment is noise.
thomblake:
You can have incorrect subgoals in that they fail to help you achieve the goals towards which they are supposed to aim. According to one popular view, you can have incorrect emotions - and this is important, as our emotions have a great deal to do with our ability to be rational. To quote:
handoflixue:
This comment was also quite helpful :)
Swimmer963 (Miranda Dixon-Luinenburg):
Maybe you disagree, but from what I've seen, a large subset of the LW population thinks that both goals and emotions can and should be modified if they are sub-optimal.
[anonymous]:
I can see handoflixue's logic, and your appeal to popularity does not defeat it. It makes LW seem to be irrational. To directly answer the logic, remind handoflixue that goals form a hierarchy of goals and subgoals, and a subgoal can be incorrect relative to a goal. Similarly, emotions can be subservient to goals. For example, anger can serve the goal of self-protection. A specific feeling of anger can then be judged as correct or incorrect depending on whether it serves this goal. Finally, all of our conscious goals can be judged from the standpoint of natural selection. And conversely, a person may judge natural selection from the point of view of his conscious goals.
Swimmer963 (Miranda Dixon-Luinenburg):
That...seems true. I guess I've never divided my goals into a hierarchy, and I often find my emotions annoying and un-useful. I think my comment holds more true for emotions than for goals, anyway. I'll have to think about this for a while. It's true that although I have tried to modify my top-level goals in the past, I don't necessarily do it because of rationality.
[anonymous]:
If you present your best case for LDS, and no one here considers it persuasive enough to convert to LDS, will you take this as strong evidence that you are mistaken? Enough to relinquish LDS? If not, why not?
Swimmer963 (Miranda Dixon-Luinenburg):
I think that is an excellent attitude to take.
David_Gerard:
+1
AdeleneDawner:
I like this definition.
jasonmcdowell:
You said what I'm thinking, only in complete paragraphs. Seriously. I was thinking:

  • his content is good, but I have questions and suspicions.
  • missionaries evolved a bunch of good techniques
  • can these techniques be used without negative side effects?

This sequence still feels like it is privileging the hypothesis of the desirability of LDS organizational practices, but you make a good point. We lack condensed introductions. Eliezer handed out copies of the Twelve Virtues of Rationality in the past, but I don't remember any other attempts at pamphlets or booklets. How much can the material be compressed with reasonable fidelity?

Some possible booklet ideas:

  • You Are A Brain: map/territory distinction, mind projection fallacy, reductionism, mysterious answers
  • Short guide to cognitive biases: examples of biases that can be directly illustrated and are less likely to be turned against others, like hindsight bias or the Wason selection task.
  • Checklists: e.g. Polya and Adam Savage on problem solving, Personal MBA questions to improve results
  • Sequence overviews: Scientific Self-help, 37 ways words can be wrong
  • Techniques/Heuristics: noticing confusion, holding off solutions, consider-the-opposite, tracking/data collection, being specific, and perhaps touching on deliberative vs automatic cognition or Haidt's elephant and rider metaphor
  • You are already living with the truth: Litany of Gendlin (plausibly the most important thing begin…
David_Gerard:
Only insofar as it describes them at all. (Which, given what human brains are like, does automatically privilege the hypothesis inside them, annoyingly.) Remind yourself that there will be a space of other approaches to apply?

There's a lot to learn from your posts and I appreciate your effort. But speaking for myself, if the LW community follows this route en masse, it'll make me sad and eventually drive me away. I don't want to be associated with anything that uses such techniques to spread.

I think that you are responding to the post at too low a level of abstraction.

One of the pleasures of reading Isaac Asimov's Foundation is the scene in which Hari Seldon demonstrates the power of psychohistory by designing an organisation, a variation of the Women's Institute, that grows and grows and grows. (And one of the shocks of re-reading Foundation is that the scene isn't actually in that book. Where is it?) One's intuition is that there are laws governing the dynamics of society and the rise and fall of empires. Can we discern enough of those laws to make sure that rationalism flourishes?

calcsam's post contains much food for thought, and I envy his plain, simple prose style. Hence my praise.

Reading between the lines to detect and reject a suggestion that we turn Less Wrong into a church along the line of Latter-day Saints is a mistake. Food for thought should be digested not puked up.

Some rationalists come unstuck by becoming addicted to alcohol. Do we want a no-drinking rule, like the Latter-day Saints have? I don't think we should even be asking the question. I don't think it is helpful to draw parallels so closely. But we don't have to just drop the issue.

We are aware…

And one of the shocks of re-reading Foundation is that the scene isn't actually in that book. Where is it?

It's the short story "Snowball Effect" by Katherine MacLean. Collected in this anthology edited by Asimov.

calcsam:
Thanks Alan. You get my point. But even if rationalists don't adopt any LDS norms -- and I don't expect them/us to (I'm hovering at the edge of the community at the moment) -- understanding the framework should still be helpful.

I don't want to be associated with anything that uses such techniques to spread.

Which techniques are you referring to? Techniques developed by the LDS, or techniques that have some other property?

I saw two concrete techniques in this post. "Help people implement positive changes through regular interaction" and "use brevity in your promotional materials." I can see problems with each of them, but I cannot see why those problems make them negative overall.

For the first, you might argue that you want Less Wrong to only comprise self-starters. I would direct you to the posts about akrasia, specifically the ones which detail how social influence can cut down on akrasia.

For the second, you might argue that we only want people with high IQs and available free time who can get through the Sequences as they are. While the first might be desirable, I'm unsure about the second: are the Sequences really the shortest they could possibly be? Should we really be attracting the people who are willing to read tens of thousands of words before getting results, instead of proving our positive effects to others before asking them to commit?

cousin_it:
My complaint isn't that the proposed meme-spreading techniques lack effectiveness. I agree that they're likely to be effective. My complaint is that I avoid people and communities that use such techniques, so if LW starts using them, I'll start avoiding LW.

My biggest concern is that simply spreading the memes will be counterproductive if they are not spread properly (partial understanding would be a huge concern). If everyone is just guessing the teacher's password, we will have probably done more damage than good.

I too have reservations about this material. But, I suspect that your recoiling and mine might stem from the representativeness heuristic - that anything associated with religion or its propagation techniques is Dark. I have a lot of negative associations there, and I'm betting many here do. However, religion's propagation techniques are unquestionably powerful, and we should be able to learn from them.

To counteract this bias somewhat, imagine having some system - I don't know the details yet, and neither do you - that would easily introduce the core concepts of rationality to anyone interested, and would make it easy to adopt those personal practices that one found positive or useful. Would this be a good thing?

I think so, yes, and clearly so -- if improved decision-making, easier willful action, and clearer analysis were already widespread, we'd all live with fewer of other people's bad decisions. Certainly, the clarity I've learned here has helped me; I'm willing to bet it'd help others as well.

If spreading rationality -- or, at least, making rationality more palatably available -- is a good thing, then we should as a community figure out what works to do that. If certain specific meme-spreading techniques are Dark or noxious, then we should identify what's wrong in them, and see if we can eliminate them without making the technique wholly ineffective.

So, can you identify what parts of those techniques are noxious to you?

The OP's previous post described a model of rationalist communities where you have "distillers" and "organizers" telling people to do stuff, some of which will be proselytizing. But I don't like being told what to do or telling others what to do, especially if it's proselytizing. So I would have no place in such a community.

Also it seems to me that when a product needs to be resold by its consumers, like religion or Amway, that means the product probably isn't any good. Imagine Steve Jobs using MLM to sell the iPhone! If Eliezer's ideas about solving confusing problems actually helped, some of the many researchers who read LW would've found them useful and told us about it. And if the sequences were as useful in everyday life as they are well-written, a lot of people would have demonstrated that convincingly by now. In either case we would have an iPhone situation and would beat customers off with a stick, not struggle to attract them.

ETA: this comment is a joint reply to Vaniver, XFrequentist, fiddlemath and Davorak, because their questions were quite similar :-)

Okay, but also imagine Steve Jobs trying to sell the original iPod or iMac (before Apple's huge rebound in popularity) with no TV ads, billboards, posters, product placements, press releases, slogans, over-hyped conferences, or presence in retail stores; just a website with a bunch of extremely long articles explaining what's great about Apple products.

cousin_it:
In our case the task is simpler because the articles themselves are the product. Perhaps another reason for our disagreement is that I think new ideas should spread by their own merit, not through people explicitly trying to convert other people. The LW worldview is based on the ideas of many people (Daniel Kahneman, Hugh Everett, E.T. Jaynes, Judea Pearl, Robin Hanson...), none of whom ever tried to build communities of laymen for the express purpose of spreading their ideas in MLM style. It makes me cringe to even imagine this.
saturn:
Actually, I agree with you. I only meant to point out that without some kind of deliberate promotion even really good ideas can be overlooked, but there are ways to do that without acting like Mormon missionaries.
CronoDAS:
This is very similar to how Magic: the Gathering gets spread; most players learned how to play from someone else.
Vaniver:
What community organizational structure are you comfortable with? What tradeoffs will you accept between organizational structure and goal satisfaction? Does rationalists telling other people about rationality make rationality worse? It seems to me, though, that the Mormon style of proselytizing is learning about prospective buyers, figuring out how to make their lives better, and then helping implement that. The benefit of that system is you have a knowledgeable person finding the highest-value tip for potential customers, which gets a lot more customers than having your catalog available at the public library and letting them find what suits them best. The question is not whether or not the tip is effective, but who pays to find that out.

What community organizational structure are you comfortable with? What tradeoffs will you accept between organizational structure and goal satisfaction?

I don't have any goals that would be well served by joining an authoritarian volunteer community. All my goals are well served by my job or my informal social network.

I don't have any goals that would be well served by joining an authoritarian volunteer community. All my goals are well served by my job or my informal social network.

Everything cousin_it says in this thread, assume I said it as well.

AdeleneDawner:
Then why are you here at all? Or is LW included in the latter? (It's actually at the 'formal group' end of my social experience, but I may be more abnormal than I think...)

Yeah, LW is included in the latter. As far as I can tell, it doesn't yet require me to tell others what to do or be told what to do :-) For me it's a place to hang out with smart people and talk about interesting things, like an intellectual nightclub.

wedrifid:
One that doesn't have bouncers at the door to prevent guys coming in unless they bring chicks with them, to keep the balance...
handoflixue · 1 point · 13y
I would think there is some fruitful discussion to be had about which of these techniques are considered valuable and which would be potentially alienating. Are you objecting to ideas such as "using brief summaries to sell the ideas" as well as to the idea of rationalist communities organized that way? I'd agree that I don't want to be part of such a rationalist community, but using brevity as a marketing tool seems useful.
XFrequentist · 5 points · 13y
Which specific techniques do you dislike?
Davorak · 3 points · 13y
What criteria do you use to separate techniques you approve of from those you disapprove of? Stepping through that process with one of the techniques in the OP's post would clear up much of why you would leave Less Wrong if they became prominent.
Cayenne · 4 points · 13y
I'm not interested in making this into a church-like group either. Some norms are useful (e.g. respond to words with words and never ever with bullets), while others (you must marry a rationalist? you must never do $list-of-sins or else?) would be abhorrent to me. Helping people is a good goal to have. Regimenting their lives is not a goal I find appealing at all. I have enough trouble with my own life; why would I want to be in charge of others' lives too? Ick! Edit - please disregard this post

I agree with the sentiment here, but I also think it's a bit of a knee-jerk reaction, and that with a bit of work we can come up with some norms that we're already using that we'd like to spread. Rationalist taboo comes to mind, along with actually updating based on evidence, and generally changing one's behavior to match one's beliefs. That last one seems to require a bit more give and take than just handing someone a set of rules, but I think that's a good thing, and we could streamline the process by coming up with a list of common beliefs and the behavioral implications thereof (cryo, for example).

Yes, I think it's a case of codifying the norms that are already held, taking care to watch out for verbal overshadowing and similar errors.

The fears expressed by cousin_it and Cayenne are quite reasonable ones to hold, since every cause wants to be a cult. Perhaps take care to facilitate schisms and stay on good terms with them, to avoid the rules ossifying into detached levers and hence lost purposes?

Not that examples spring to mind. Perhaps annual Mixed Rationalist Arts tournaments, to keep a reality check in play.

(Heck, I think that one's a great first step, if we can work out what to compete on.)

Cayenne · 9 points · 13y
It's not a knee-jerk reaction, but more like a visceral rejection. The thought of this community becoming something with the feel of the church I grew up in makes me feel sick, and if it happened I would walk away and never look back. This is certainly a bias, but I would still do it. We need to have a clear and concise list of taboos, skills, and beliefs that we want to make into norms, and then the whole community has to talk them over and make sure that we really, really want to make them into our norms. If we're going to start adopting practices from other groups, I believe this should be our highest priority. Edit - please disregard this, it's needlessly prescriptive.
AdeleneDawner · 3 points · 13y
I noticed your comment about being ex-Mormon after I wrote the grandparent. Even without that context, but especially with it, this is reasonable, and a good warning signal for us to look out for. Please try to give us a heads-up if we start getting close to that point, too, ok?

In a more general sense, I do think it's important to keep the look-and-feel of any organizational structure we put together different from the general look-and-feel of churches. I see a few advantages to this, most obviously that it will avoid driving away non-rationalist atheists and it will help remove us from direct competition with churches so that people who 'already have a religion, thanks' don't see that as a reason not to check us out.

Absolutely. Also note: None of the above is intended as an actual endorsement of any particular plan. We should figure out what we might do before we decide whether to do that thing, as far as I'm concerned, and we're still in the first stage of that.

Well, to be slightly more clear, I am transgender. This is a sin in the LDS church, since the surgeries and hormones 'desecrate my temple' (temple == body). There is a limit to how much discrimination against ANY group I can stand before I leave, even if that group is 'those people that want to kill us because we don't believe in their god'.

At the same time, I really dislike the idea that I might be keeping a group from succeeding by giving negative input. I'm fairly likely to just withdraw without much fanfare if I decide that that is what's happening, since I hate drama, and making a scene about something like that would feel like an attempted hostage situation. Just no.

Since I believe in the 'step-forward' method of getting things done, I'll start a discussion now to try to codify our norms.

Edit - please disregard this post, especially the last part. Empirical testing shows that I am not good at this kind of thing.

NancyLebovitz · 2 points · 13y
Could you expand on the things about LDS that you don't want to see replicated among rationalists?
Cayenne · 5 points · 13y
Some of it is difficult to pull apart into clear thought, but I'll try. I don't want to have a list of groups I have to hate to belong. I don't want to have someone trying to control my behavior by defining things as 'sin'. I don't want to be told 'we love you, we just don't like your actions', when it's clear that there is no love involved in any case. I don't want to have to remember people and feel sorry that they're part of a malignant memeplex, and that I can't do anything to help them. I don't want to dread going to a meet because I don't fit in. No, I really don't like the LDS church. That's probably never going to change, though I'll try not to influence others' decisions on the matter. I don't hate the members, I just feel sad when I think of them, and of my ex-family. Edit - please disregard this post
Swimmer963 (Miranda Dixon-Luinenburg) · 2 points · 13y
That makes me sad too. I don't have a particularly negative attitude towards religion (all my personal interactions with religions and religious people have been pretty positive and haven't included any of the aspects on your list), but I hear stories like yours about the incredibly toxic things people can do with their religions, and it depresses me, mostly because I don't think it's purely a symptom of people being religious. Otherwise, how could nearly all the religious people I've met be more accepting and less hypocritical about their daily life decisions than my atheist-by-default friends? It's more a symptom of people being flawed humans, and that is depressing.
NancyLebovitz · 0 points · 13y
How much of what you don't like about LDS is entangled with the organizational structure? I don't have a strong answer, just some concerns. It may be that a lot of what's wrong there is having a hard boundary between members and non-members. If so, rationalists may be able to beat that one by wanting people to be more rational, though there do seem to be some firm lines in this community, like being obligated to be a materialist. You may be stuck with that one, especially in regard to cryonics, at least in the sense that you can't do much to help them.
Swimmer963 (Miranda Dixon-Luinenburg) · 0 points · 13y
I'm not sure what you mean by the word "materialist" in this context. Could you explain?
Barry_Cotter · 0 points · 13y
Materialist as in reductionist, as in Thou Art Physics, not as in being materialistic.
Swimmer963 (Miranda Dixon-Luinenburg) · 0 points · 13y
Ok (sigh of relief). That's what I thought, but I'm not used to seeing the word used to mean 'reductionist'.
DSimon · 2 points · 13y
We are all reductionist girls*, living in a reductionist world.

* For a sufficiently broad value of "girls", or a sufficiently narrow value of "all".
saturn · 2 points · 13y
Not exactly. Strange as it sounds, there are non-reductive materialist philosophers.
Cayenne · 0 points · 13y
It is hard to say. I have no doubt that rationalists will prevail eventually, and I wish luck to the ones that try. Edit - please disregard this post
Vaniver · 1 point · 13y
Isn't this the sort of thing you come here to overcome? Or am I just thinking of our sister site?
Cayenne · 2 points · 13y
It would have to be something I would want to overcome. I came here because the sequences were fascinating to read, but I find more and more that I simply can't consider myself to be rational in any meaningful way. I probably should try to overcome it, I suppose.

But rationalism doesn’t have a well-defined set of norms/desirable skills to develop. As a result, we Less Wrongians unsurprisingly also lack a well-developed practical system for implementation.

Implementation of what? What's the purpose of these hypothetical norms? There's no point in propagating arbitrary norms. You are describing it backwards.

Swimmer963 (Miranda Dixon-Luinenburg) · 3 points · 13y
I'm not sure that "norms" are the same as "desirable skills to develop". The LessWrong community definitely has a list of desirable skills: improving one's understanding of Bayes, for example. Maybe not well-defined, though.
AdeleneDawner · 3 points · 13y
Honestly, I think I would have balked if calcsam had offered specific answers to this question, rather than the general principle of deriving them from the "theology". He is a relative outsider, and I think this is something we should be hashing out for ourselves.
Vladimir_Nesov · 6 points · 13y
I'm not sure it makes sense to build an idea around a premise that there is a central problem of propagating norms before we have some examples of norms which should be propagated.
Eneasz · 3 points · 13y
What about things like "overcome your biases", "raise the sanity waterline", and "win the stars"?
Dorikka · 0 points · 13y
Or, rather, norms which achieve these goals. These don't seem low-level enough to be norms in themselves.
Davorak · 0 points · 13y
I don't know; the LDS example "Your purpose on earth is to become like God" is pretty big. Big goals are good if they are backed up by supportive low-level goals.
wedrifid · -2 points · 13y
You can't get all that much higher than 'win the stars'! :P
Cayenne · 0 points · 13y
It seems that the proper answer to this is to develop our norms in a rational manner, and reject arbitrary norms that have no purpose. Edit - please disregard this post

Just so you know, what you're advocating for LW are practices that have helped Christianity become a dominant and universalizing religion. Christians want everyone to be a Christian; that's basic to Christianity. Does "rationalism", let's call it that, want everyone to be a rationalist? I guess that's a good question, and it should be asked.

But let's also be mindful of how Christianity tries to attain universal status:

"a strong focus on strengthening the family"

It is key that Christianity spreads within the family and, importantly, through generations. "Be fruitful and multiply" belongs here. You shouldn't have non-Christian members of the family.

"daily family prayer and scripture study"

Not just a strong family, but a strong Christian family. The ties of family should be used for religious purposes.

"sex only inside marriage"

Every natural human desire needs to be mediated with religious meaning and purpose; this makes people lustful for religion.

This is what you call "the basic package". The basic package has reasons for its existence, but not reasons that rationalists would necessarily agree with.

DSimon · 0 points · 13y
I think many proponents of rationalism do, judging by the popularity of the phrase "raising the sanity waterline" around the site. However, I also notice that LW is not as expansion-focused in its practical actions as its closest neighbor, the skeptical movement. I've heard some people express a fear that we might end up with forever-early rationalists who learn just enough to be dangerous and stop there, and later we end up having to correct the public misperceptions they generate. My personal view: I think spreading rationalism is good, but that we are best off focusing on people who are already primed for it, such as those who are already very into skepticism or science and are looking for something hardcore to sit at the center of those interests.

But rationalism doesn’t have a well-defined set of norms/desirable skills to develop.

Actually changing your mind, learning the simple math of various fields, and becoming more luminous seem to represent a set of desirable skills to me, though I admit that is far from comprehensive. See also the twelve virtues of rationality.

Note that I'm not anywhere close to finished with my series 'The Science of Winning at Life.' I've barely started.

calcsam · -2 points · 13y
Glad to hear that.

Brevity is key to implementation.

I like this idea. It can be difficult to read the sequences when, post after post, you already agree with the idea. It would be wonderful to have the sequences compressed so that it was easy to find the sections containing new ideas, making reading more targeted.

This clear and brief essay on the mechanics of community building is a valuable contribution to Less Wrong. Thank you.

There's nothing to "implement" in the sequences. They're philosophical blog posts.

OTOH, I guess you could find something to implement in any text if you thought hard enough. But I don't think the sequences straightforwardly mandate any particular way of life or practice.

(I've been editing this comment after posting it, sorry for confusion.)

The number of people on this site striving to become more rational, with the sequences as a foundation, suggests otherwise.

jasonmcdowell · 8 points · 13y
I'm going to start having kids in a few years. I have my eye on some of the sequences, such as Mysterious Answers to Mysterious Questions. I need to find a way to distill this stuff down so I can teach it to my children.