
Holy Books (Or Rationalist Sequences) Don’t Implement Themselves

Post author: calcsam 10 May 2011 07:15AM

Related to: Lessons from Latter-day Saints, Building Rationalist Communities overview

This is my basic thesis:

Marx needed a Lenin. Fermi, Hahn and Meitner needed a Manhattan Project. EY and the Sequences need more clearly- and simply-defined rationality skills and methods for improving them.

Using Eliezer’s levels scheme, these are the three descending levels on which belief systems operate: theology, norms, and implementation.

I’ll give some examples. Here’s a general example, again from the Latter-day Saints:

  • Theology: God knows everything. Your purpose on earth is to become like God.
  • Norms: You should pursue as much education as possible.
  • Implementation: Create and operate a really big, cheap university system.[1]

Here’s one that I often dealt with as a missionary:

  • Theology: God is really good at making decisions. Your purpose on earth is to become like God.
  • Norms: You shouldn’t take alcohol, tobacco, tea, coffee, or addictive substances. Taking addictive substances impairs your ability to make correct decisions.
  • Implementation: We are going to bring you candy every week so that when you’re tempted to buy a cigarette, you can eat the strawberry toffee instead. (Or, we are going to stop by your house every day at 8:30pm to give you a boost, because going from 7 cups of coffee a day to 0 is tough.)

I did both of these (with different people), and they worked.

Norms and Implementation

As a missionary for the Church, my basic role was to:

  •  find people who were willing to try something out
  • design individualized “commitment systems” for each person, and
  • support them in implementing them.

There’s a lifestyle change here.

The “basic package” (my terminology), which is a prerequisite to joining the church, includes: a strong focus on strengthening the family, daily family prayer and scripture study, the aforementioned health code, and sex only inside marriage. The glue is weekly church attendance, ensuring membership in a community that shares the same values.

After the “basic package,” it gets a bit more complex, as there are lots of higher-level elements of this lifestyle. To sample a few in no particular order:

  • Loving others. Developing gratitude. Keeping a journal. Following Church leaders. Inviting other people to church. Serving others, especially by accepting responsibilities in church. Pursuing education. Forgiving others.
  • Understanding that you have innate self-worth. Not gossiping. Dressing modestly. Being a good parent. Honoring and respecting parents. Keeping a budget. Doing family activities and not shopping on Sunday. Staying out of debt.
  • For the doubting: continuing to live these habits until they develop (as expected) greater belief through experience.[2]

Obviously these are different from rationalist norms, but my point is that they are fairly comprehensive. Though each topic is discussed fairly regularly in church, it’s impossible to implement them all into your life at once. It’s easy to feel overwhelmed by the flood of new information. (Sound familiar?)

And that is why we were there: to design mini-programs for each person.[3] We would isolate a couple of specific standards that would be effective for person X, and assist in implementation. If they liked it and wanted more, we helped them implement the “basic package” lifestyle.

This decision, that they liked it and wanted more, was the single most crucial decision that someone could make. It is directly related to Bhagwat’s Law of Commitment: “The degree to which people identify with your group is directly proportional to the amount of stuff you tell them to do that works.” I will discuss this further in a subsequent post.

Okay, so how does this apply to Less Wrongians?

Less Wrong has its version of a theological framework – the Sequences. They give a comprehensive set of statements about the way the world works, drawn from evolutionary psychology, anthropology, Bayesian statistics, etc.

But rationalism doesn’t have a well-defined set of norms/desirable skills to develop. As a result, we Less Wrongians unsurprisingly also lack a well-developed practical system for implementation.

You may cite lukeprog’s guide. That’s good, but it’s only six posts. Less Wrong needs a lot more of it!

Or maybe you’ll say that if you read the Sequences carefully, etc, etc. Well, I did. Mysterious Answers to Mysterious Questions is 51 dense pages in Word, or about 25,000 words. This is an (extremely good) foundational text. It is not a how-to manual.[4]

Brevity is key to implementation.

For Latter-day Saints, the basic explanation of family standards is about 6000 words (95% of the important stuff is from page 4 to 15). The basic guide for teenagers is about 4000 words, and the basic guide for running a church organization is about 12000 words. And each one is very clear about what to do. (The teenage guide most clearly illustrates this point about brevity.)

The easiest way to begin building a how-to manual is for LW members to post specific, short personal examples of how they applied the principles of rationality in their day-to-day lives. Then they should collect all of the links somewhere, probably on the wiki.

If this sounds salesman-y or cheesy to you, or if you're extremely skeptical about religion, I quote a commenter on my last post. “If this works for people that are obviously crazy," said Vaniver, "that suggests it'll work for people who are (hopefully obviously) sane.”


[1] Admittedly, this also supports other norms, such as ‘marry another Latter-day Saint.’

[2] I’m not claiming this is perfect. Over the four years since I joined, I've encountered varying amounts of ingroup snobbery, use of these standards to judge others, cliquishness, and intolerance towards certain groups, primarily gays. Plus all of the normal human imperfections.

[3] In designing and sequencing programs, we generally used a simple cost-benefit standard: how much will this help X vs. how much effort will it cost X?
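The cost-benefit standard in footnote [3] amounts to ranking candidate practices by expected benefit per unit of effort. A minimal sketch of that idea, in which all practice names and scores are purely hypothetical placeholders, not anything from the post:

```python
# Rank candidate practices for a person by benefit/effort ratio, highest first.
# The practices and scores below are illustrative placeholders only.

def prioritize(practices):
    """Sort (name, benefit, effort) tuples by benefit/effort ratio, descending."""
    return sorted(practices, key=lambda p: p[1] / p[2], reverse=True)

candidates = [
    ("daily journal", 3.0, 2.0),           # moderate benefit, moderate effort
    ("weekly budget review", 4.0, 1.0),    # high benefit, low effort
    ("read one sequence post", 2.0, 1.0),  # modest benefit, low effort
]

for name, benefit, effort in prioritize(candidates):
    print(f"{name}: ratio {benefit / effort:.1f}")
```

With these made-up numbers, the low-effort, high-benefit habit comes out on top, which mirrors the "help X most per unit of X's effort" heuristic described in the footnote.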

[4] By comparison: the Bible is a foundational text of Christianity. The Purpose-Driven Life is a derivative how-to manual. This is a distillation of the Sequences, which is at least a start.

Comments (149)

Comment author: Gray 11 May 2011 02:20:06AM 8 points [-]

Just so you know, what you're advocating for LW are practices that have helped Christianity become a dominant and universalizing religion. Christians want everyone to be a Christian; that's basic to Christianity. Does, let's call it "rationalism", want everyone to be a rationalist? I guess that's a good question, and it should be asked.

But let's also be mindful about how Christianity tries to attain a universal status:

"a strong focus on strengthening the family"

It is key that Christianity spreads within the family, and importantly, through generations. "Be fruitful and multiply" belongs here. You shouldn't have non-Christian members of the family.

"daily family prayer and scripture study"

Not just a strong family, but a strong Christian family. The ties of family should be used for religious purposes.

"sex only inside marriage"

Every natural human desire needs to be mediated with religious meaning and purpose, this makes people lustful for religion.

This is what you call "the basic package". The basic package has reasons for its existence, but not reasons that rationalists would necessarily agree with.

Comment author: DSimon 16 May 2011 06:54:09PM 0 points [-]

Does, let's call it "rationalism", want everyone to be a rationalist?

I think many proponents of rationalism do, judging by the popularity of the phrase "raising the sanity waterline" around the site.

However, I also notice that LW is not as expansion-focused in its practical actions as its closest neighbor, the skeptical movement. I've heard some people express a fear that we might end up with forever-early rationalists who learn just enough to be dangerous and stop there, and later we end up having to correct the public misperceptions they generate.

My personal view: I think spreading rationalism is good, but that we are best off focusing on people who are already primed for it, such as those who are already very into skepticism or science and are looking for something hardcore to sit at the center of those interests.

Comment author: lukeprog 11 May 2011 03:28:55AM 5 points [-]

Note that I'm not anywhere close to finished with my series 'The Science of Winning at Life.' I've barely started.

Comment author: calcsam 11 May 2011 07:46:19PM 0 points [-]

Glad to hear that.

Comment author: jschulter 11 May 2011 12:14:11AM 6 points [-]

But rationalism doesn’t have a well-defined set of norms/desirable skills to develop.

Actually changing your mind, learning the simple math of various fields, and becoming more luminous seem to represent a set of desirable skills to me, though I admit that is far from comprehensive. See also the twelve virtues of rationality.

Comment author: prase 10 May 2011 11:56:39AM 22 points [-]

Reading this, I wonder why a LDS missionary got interested in a rationalist community which is generally hostile to religion. I would appreciate some explanation about the author's motivations. Strictly speaking, this is irrelevant to the message, but being confused about one's aims somewhat lowers my trust in one's suggestions. This is not to say that the suggestions themselves are suspicious or clearly wrong - on the contrary, they are written in an impartial style that very well fits into LW customs, which makes me even more curious about the author's background.

On a slightly different note, there is only so much one can improve on the organisational level, and we should keep in mind what expectations we have of this community. In a sense, I second Vladimir Nesov's and cousin_it's comments. Community building is instrumentally important, but really shouldn't become a major terminal value if we want to maintain a high level of rationality. Aspiring rationalists are in danger of wandering in a strange circle: at first, they crave getting rid of common biases and developing abilities for efficient truth seeking. But then, these very abilities lead them to discover that sometimes, an efficient way to do things is to compromise with human biology and use techniques which cooperate with the biases instead of fighting them. Often it turns out that the best techniques are already implemented by somebody - church missionaries for example - just because it is hard to compete with memes perfected by centuries-long testing and evolution.

In short, I am afraid that if we put too much weight on values (such as community building) pursued by non-rationalist groups, it is likely that the most efficient methods are possessed by those non-rational groups (like churches) and we should (instrumentally) learn from them (as this series of posts claims). This wouldn't itself be bad if these methods weren't optimised for different sets of goals. It is very easy to adopt a few manipulation techniques to strengthen the community's coherence and be astonished by their efficacy while being completely unaware of all the non-obvious ways these techniques undermine the main goal, which is rationality. When people start realising that something has gone wrong, it may be too late. We already have the label of dark arts for efficient but dangerous procedures. Some LWers may have already grown confident enough to believe that they can use the dark arts without being harmed. I am not so optimistic.

Comment author: jasonmcdowell 12 May 2011 10:26:23PM *  3 points [-]

You said what I'm thinking, only in complete paragraphs. Seriously. I was thinking:

  • his content is good, but I have questions and suspicions.
  • missionaries evolved a bunch of good techniques
  • can these techniques be used without negative side effects?
Comment author: calcsam 10 May 2011 02:56:33PM 31 points [-]

Reading this, I wonder why a LDS missionary got interested in a rationalist community which is generally hostile to religion. I would appreciate some explanation about the author's motivations.

Because there are things I can learn here. I can handle the hostility to religion. But if you don't cross-pollinate, you become a hick.

Comment author: [deleted] 13 May 2011 08:21:16PM 4 points [-]

If you present your best case for LDS, and no one here considers it persuasive enough to convert to LDS, will you take this as strong evidence that you are mistaken? Enough to relinquish LDS?

If not, why not?

Comment author: Swimmer963 10 May 2011 07:37:49PM 6 points [-]

Because there are things I can learn here.

I think that is an excellent attitude to take.

Comment author: XFrequentist 10 May 2011 03:26:00PM 10 points [-]

It doesn't seem to me to be possible to hold both rationality and religion in one's head at the same time without compartmentalization, which is one of the things rationality seeks to destroy.

I can actually quite easily accept that it could be a good idea for rationalists to adopt some of the community-building practices of religious groups, but I also think that rationality is necessarily corrosive to religion.

If you've squared that circle, I'd be interested to hear how. Being somewhat religious for the social bit but having excised the supernaturalism is the only stable state I can think of.

Comment author: calcsam 12 May 2011 03:05:41PM 4 points [-]

I'd be happy to answer that. But for purposes of keeping the thread more on the community organization topic, I wanted to channel discussion of my religious beliefs over on this discussion thread. Would you like to repost your comment over there?

Comment author: handoflixue 11 May 2011 10:36:07PM 1 point [-]

Speaking solely for myself, I've found that my spiritual / religious side helps me to set goals and to communicate with my intuitions. Rationality is simply a tool for implementing those goals, and processing/evaluating that intuitive data.

I've honestly found the hostility towards "spirituality writ large" here rather confusing, as the majority of the arguments seem to focus on a fairly narrow subset of religious beliefs, primarily Christian. I tend to write it off as a rather understandable bias caused by generalizing from "mainstream Christianity", though, so it doesn't really bother me. When people present actual arguments, I do try and listen in case I've missed something.

Or, put another way: Rationality is for falsifiable aspects of my life, and spirituality is for the non-falsifiable aspects of my life. I can't have "incorrect" goals or emotions, but I can certainly fail to handle them effectively.

Comment author: TimFreeman 12 May 2011 12:19:34AM *  7 points [-]

I can't have "incorrect" goals or emotions

You can have goals that presuppose false beliefs. If I want to get to Heaven, and in fact there is no such place, my goal of getting to Heaven at least closely resembles an "incorrect goal".

This raises an interesting question -- if a Friendly AI or altruistic human wants to help me, and I want to go to Heaven, and the helper does not believe in Heaven, what should it do? So far as I can tell, it should help me get what I would want if I had what the helper considers to be true beliefs.

In a more mundane context, if I want to go north to get groceries, and the only grocery store is to the south, you aren't helping me by driving me north. If getting groceries is a concern that overrides others, and you can't communicate with me, you should drive me south to the grocery store even if I claim to want to go north. (If we can exchange evidence about the location of the grocery store, or if I value having true knowledge of what you find if you drive north, things are more complicated, but let's assume for the purposes of argument that neither of those hold.)

This leads to the practical experiment of asking religious people what they would do differently if their God spoke to them and said "I quit. From now on, the materialists are right, your mind is in your brain, there is no soul, no afterlife, no reincarnation, no heaven, and no hell. If your brain is destroyed before you can copy the information out, you're gone." If a religious person says they'd do something ridiculous if God quit, we have a problem when implementing an FAI, since the FAI would either believe in Heaven or be inclined to help religious people do something ridiculous.

So far, I've had one Jehovah's Witness say he couldn't imagine God quitting. Everyone else said they wouldn't do much different if God quit.

If you do this experiment, please report back.

It would be a problem if there are many religious people who would apparently want to commit suicide if their God quit: the FAI convinces itself that there is no God, then helpfully goes and kills them.

Comment author: Eliezer_Yudkowsky 14 May 2011 05:22:30AM 6 points [-]

Erm, that's supposing the religious person would actually want to suicide or do the ridiculous thing, rather than this itself being an expression of belief, affirmation, and argument of the religion. (I.e., as appeal to consequences, or saying negative things about the negation.)

Comment author: TimFreeman 14 May 2011 01:14:18PM *  1 point [-]

Erm, that's supposing the religious person would actually want to suicide or do the ridiculous thing, rather than this itself being an expression of belief, affirmation, and argument of the religion. (I.e., as appeal to consequences, or saying negative things about the negation.)

The most reasonable interpretation I can find for your statement is that you're responding to this:

If a religious person says they'd do something ridiculous if God quit, we have a problem when implementing an FAI, since the FAI would either believe in Heaven or be inclined to help religious people do something ridiculous.

I agree, the goal would be to figure out what they would want if their beliefs were revised, and revising their circumstances so that God puts Himself out of the picture isn't quite the same as that.

The experiment also has other weaknesses:

  • eBay bidding shows that many people can't correctly answer hypothetical questions. Perhaps people will accidentally give false information when I ask.

  • The question is obviously connected with a project related to atheism. Perhaps some religious people will give false answers deliberately because they don't want projects related to atheism to succeed.

  • The relevant question is what the FAI thinks they would want if there were no God, not what they think they would want. A decent FAI would be able to do evolutionary psychology and many people can't, especially religious people who don't think evolution happened.

  • It's not a real experiment. I'm not systematically finding these people, I'm just occasionally asking religious people what they think. There could easily be a selection effect since I'm not asking this question of random religious people.

Comment author: thomblake 11 May 2011 10:47:06PM 7 points [-]

I can certainly fail to handle them effectively.

If 'spirituality' helps you to handle these things effectively, that is empirically testable. It is not part of the 'non-falsifiable' stuff. In fact, whatever you find useful about 'spirituality' is necessarily empirical in nature and thus subject to the same rules as everything else.

Most of the distaste for 'spirituality' here comes from a lack of belief in spirits, for which good arguments can be provided if you don't have one handy. If your 'spirituality' has nothing to do with spirits, it should probably be called something else.

Comment author: handoflixue 11 May 2011 11:57:23PM 4 points [-]

Hmmmmm, I'd never considered the idea of trying to falsify my goals and emotions before. Now that the idea has been presented, I'm seeing how I can further integrate my magical and rational thinking, and move to a significantly more effective and rational standpoint.

Thank you!

Comment author: thomblake 12 May 2011 12:01:29AM 0 points [-]

Glad to be of help :)

Comment author: JohnH 14 May 2011 08:36:35AM -1 points [-]

There are stats on the effects of religion on a population that practices said religion. This should give some indication of the usefulness of any spirituality.

Comment author: XFrequentist 11 May 2011 11:47:35PM 3 points [-]

We are at high risk of arguing about words, and I don't wish to do that.

Describe specifically what you do when you're using your spiritual side. Assign it a label other than "spirituality" or "religious". Then I can give you my opinion. As stated your comment is noise.

Comment author: thomblake 11 May 2011 10:42:05PM 3 points [-]

I can't have "incorrect" goals or emotions

You can have incorrect subgoals in that they fail to help you achieve the goals towards which they are supposed to aim.

According to one popular view, you can have incorrect emotions - and this is important, as our emotions have a great deal to do with our ability to be rational. To quote:

Relinquish the emotion which rests upon a mistaken belief, and seek to feel fully that emotion which fits the facts. If the iron approaches your face, and you believe it is hot, and it is cool, the Way opposes your fear. If the iron approaches your face, and you believe it is cool, and it is hot, the Way opposes your calm. Evaluate your beliefs first and then arrive at your emotions. Let yourself say: “If the iron is hot, I desire to believe it is hot, and if it is cool, I desire to believe it is cool.” Beware lest you become attached to beliefs you may not want.

Comment author: handoflixue 11 May 2011 11:57:56PM 1 point [-]

This comment was also quite helpful :)

Comment author: Swimmer963 11 May 2011 10:39:05PM 1 point [-]

I can't have "incorrect" goals or emotions, but I can certainly fail to handle them effectively.

Maybe you disagree, but from what I've seen, a large subset of the LW population thinks that both goals and emotions can and should be modified if they are sub-optimal.

Comment author: [deleted] 11 May 2011 10:52:09PM 3 points [-]

I can see handoflixue's logic, and your appeal to popularity does not defeat it. It makes LW seem to be irrational. To directly answer the logic, remind handoflixue that goals form a hierarchy of goals and subgoals, and a subgoal can be incorrect relative to a goal. Similarly, emotions can be subservient to goals. For example, anger can serve the goal of self-protection. A specific feeling of anger can then be judged as correct or incorrect depending on whether it serves this goal.

Finally, all of our conscious goals can be judged from the standpoint of natural selection. And conversely, a person may judge natural selection from the point of view of his conscious goals.

Comment author: Swimmer963 12 May 2011 12:27:16AM 0 points [-]

To directly answer the logic, remind handoflixue that goals form a hierarchy of goals and subgoals, and a subgoal can be incorrect relative to a goal.

That...seems true. I guess I've never divided my goals into a hierarchy, and I often find my emotions annoying and un-useful. I think my comment holds more true for emotions than for goals, anyway. I'll have to think about this for a while. It's true that although I have tried to modify my top-level goals in the past, I don't necessarily do it because of rationality.

Comment author: Gray 11 May 2011 01:59:40AM 6 points [-]

Hmm? Thomas Bayes was a Presbyterian minister, C. S. Peirce was Catholic, and Newton was an unorthodox Christian described as "highly religious". I'd be more interested in seeing a list of esteemed rationalists who were not religious compared to a list of those who were. In any case, it is pretty clear that it is possible to hold rationality and religion in your head at the same time. This is basically how most people operate.

Comment author: Nornagest 11 May 2011 11:49:54PM *  5 points [-]

While I think there exists a level at which mainstream religious faith is inimical to epistemic rationality, I also think it's most likely a pretty advanced level, higher than most if not all of the regulars here have attained. (Note however that people can and do give up religion on grounds of rationality before hitting that level.) It's certainly possible to make substantial contributions to the advancement of human rationality in its present state while also being a theist, and that was still truer a few hundred years ago when the foundations of the art were being laid.

That being said, there's also a distinction to be made between esteemed rationalists and esteemed scientists or mathematicians whose work contributed indirectly to LW-method rationality. Of the people that worked on the early foundations of statistics, Laplace is the only one I can think of offhand that strikes me as having had strong public commitments to rationality in this site's usual sense.

Comment author: wilkox 11 May 2011 11:00:31PM 3 points [-]

In any case, it is pretty clear that it is possible to hold rationality and religion in your head at the same time. This is basically how most people operate.

More generally, "In any case, it is pretty clear that it is possible to hold rationality and irrationality in your head at the same time. This is basically how most people operate." I'm no more surprised to hear about a religious rationalist than I am when I notice yet another of my own irrational beliefs or practices.

Comment author: XFrequentist 11 May 2011 01:52:12PM *  3 points [-]

People who solved math problems useful for rationality but espoused false beliefs would not qualify as "esteemed rationalists" in my book.

(Robert Aumann belongs on this list, by the way.)

Comment author: Bongo 11 May 2011 12:16:15PM *  3 points [-]

He must be talking about LW-style rationality or X-rationality as distinguished from traditional rationality. And learning about X-rationality has been known to deconvert people on whom the traditional-rationality-based arguments of Dawkins and skeptics didn't work. And then there are additional arguments for why X-rationality is the real thing and deserves to be called just rationality.

Comment author: XFrequentist 11 May 2011 01:51:04PM 1 point [-]

Yes.

Comment author: JGWeissman 10 May 2011 05:12:22PM *  7 points [-]

compartmentalization, which is one of the things rationality seeks to destroy

Yes, in the ideal limit, rationalists don't compartmentalize. But decompartmentalizing too early, before you learn the skill of sanely resolving conflicts between beliefs and values in different compartments (rather than, for example, going with the one you feel more strongly), is a way to find the deepest, darkest crevices in the Valley of Rationality.

I also think that rationality is necessarily corrosive to religion.

I would agree that there is a level of rationality at which religious beliefs become impossible, though to the extent that a religious person takes their religion seriously, and expects effective rationality to produce accurate beliefs, they should not expect this until it actually happens to them. Though it does occur to me that helping rationalists to establish the level of social support available within the Mormon community (as calcsam is doing) is an effective way of establishing a line of retreat for a religious person who is uncertain about retaining their religious beliefs.

Comment author: handoflixue 11 May 2011 10:46:02PM 1 point [-]

Simple question, but what exactly is meant by "religion" when you say "there is a level of rationality at which religious beliefs become impossible"? I've been wondering about this for a while, and find it unclear whether my spiritual side is simply "not actually religion" or if there's just some huge chunk of rationality that I'm missing. Thus far, the two have felt entirely compatible for me.

Comment author: Perplexed 12 May 2011 02:26:21AM 2 points [-]

I've been wondering about this for a while, and find it unclear whether my spiritual side is simply "not actually religion".

I wonder if you could clarify for me what you mean by "spiritual" in "spiritual side"? I was raised as a Roman Catholic, and to me 'spiritual' means the other side of Descartes's dualism - the non-physical side. So, for example, I learned that the Deity and angels are purely spiritual. But being human, my spiritual side is my immortal soul - which pretty much includes my mind.

I'm pretty sure you (and millions of other people who talk about spirituality) mean something different from this, but I have never been able to figure out what you all mean.

A definition of 'spiritual' is preferred, but failing that, could you taboo 'spiritual' and say what you meant by 'spiritual side' without using the word?

Comment author: timtyler 13 May 2011 09:59:26PM *  1 point [-]

FWIW, I typically use the term in a secular sense to refer to those with interests in items from this list:

  • meditation, religious experiences, drugs, altered states, yoga, chanting, buddhism, taoism, other eastern mysticism, martial arts and self-improvement.
Comment author: handoflixue 12 May 2011 10:50:53PM 3 points [-]

More or less, it's schizophrenic/delusional episodes, with an awareness that this is in fact what they are. Mostly what I use 'spiritual' to refer to is that, during these episodes, I tend to pick up a strong sense of 'purpose' - high level goals end up developed. I have no clue how I develop these top-level goals, and I've never found a way to do it via rationality. Rationality can help me mediate conflicts between goals, conflicts between goals and reality, and help me achieve goals, but it doesn't seem able to set those top-level priorities.

About the closest I've come to doing it rationally is to realise that I'm craving purpose, and do various activities that tend to induce this state. Guided meditation is ideal, since it seems to produce more 'productive' episodes. It varies heavily whether I will get any particularly useful purpose out of one of these episodes; many episodes are drifting and purposeless, and others result in either impossible goals or 'applause light' goals that have no actual substance attached.

Ostensibly I could try to infer my goals from my emotional preferences, which I've been slowly working on as an alternative. Being bi-polar and having a number of other neurological instabilities makes it very difficult to get any sort of coherent mapping there, beyond very basic elements like 'will to live'. Even those basics can be unstable: For about a year I had no real preference on my own survival due to a particularly bad schizophrenic episode.

I'd actually be rather curious how others handle the formation of top-level goals :)


I do also notice certain skills that I'm much more adept at when I'm having such an episode. I've observed this empirically, and can come up with rational explanations for it. I'm pretty certain the same results could be replicated rationally, either by studying the skills or by figuring out what I'm doing differently during the schizophrenic episodes. I don't feel that 'spiritual' is necessarily a good label for this aspect; "intuition" or simply "changing my perceptual lens on reality" seem more accurate. I mention it here simply because it happens to stem from the same source (schizophrenic episodes).

Comment author: Swimmer963 13 May 2011 01:13:40AM 2 points [-]

I'd actually be rather curious how others handle the formation of top-level goals :)

I find I have very little emotion attached to my highest-level goals. I'm not sure but I think I derive them by abstracting from my lower-level goals, which are based more on habit and emotion, and from ideas I absorb from books, etc. I then use them to try and make my lower-level goals less contradictory.

Comment author: Sniffnoy 13 May 2011 12:59:16AM 0 points [-]

Yeah, this does not seem to have much to do with what we are usually talking about when discussing religion, supernaturalism, etc.

Comment author: thomblake 11 May 2011 10:53:51PM 1 point [-]

One reason amongst many: inasmuch as your religion includes unquestionable dogma, it is anathema to rationality. (It is for this reason that, being a philosopher, I am non-religious on methodological grounds; dead dogma is not allowed.) Having a belief that you cannot question is effectively giving it a probability of 1, which will distort the rest of your Bayesian network in terrible ways. See Infinite Certainty.
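To make the point concrete, here is a minimal sketch (my own illustration, not part of the original comment) of a single Bayes update. With any prior short of certainty, strong contrary evidence pulls the belief down; a prior of exactly 1 is immune to any evidence whatsoever:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H|E) from Bayes' rule."""
    p_e = prior * p_e_given_h + (1 - prior) * p_e_given_not_h
    return prior * p_e_given_h / p_e

# Evidence 99x more likely if the hypothesis is false:
print(bayes_update(0.9, 0.01, 0.99))  # ~0.083: even a strong belief collapses
print(bayes_update(1.0, 0.01, 0.99))  # 1.0: "certainty" ignores the evidence
```

Since the posterior is the prior multiplied by a likelihood ratio, a prior of 1 (or 0) is a fixed point: no observation can ever move it.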

Comment author: handoflixue 12 May 2011 12:00:20AM 1 point [-]

Having been raised Unitarian Universalist, I always find it very odd that "religion" is conflated with "unquestionable dogma". I don't think Unitarians have that any more than LessWrong does.

That said, if "religion" is being used as a shorthand for "unquestionable dogma", then the comments about religion make significantly more sense :)

Comment author: Vladimir_M 12 May 2011 12:33:55AM *  6 points [-]

Having been raised Unitarian Universalist, I always find it very odd that "religion" is conflated with "unquestionable dogma". I don't think Unitarians have that any more than LessWrong does.

I highly doubt that. For one, a glance at a typical Unitarian web page will show a comprehensive and consistent list of left-wing ideological positions. Are you really claiming that if one were to express deep disagreement with those among the Unitarians, the reactions would be equally dispassionate, upfront, and open to argument as they usually are when the prevailing opinion is challenged on LW? (Not that LW is perfect in this regard either, but compared to nearly any other place, credit must be given where it's due.)

Of course, some would claim that old-fashioned religious dogma is somehow incomparably worse and more irrational than modern ideological dogma, so much that the UU stuff doesn't even deserve that designation. However, I don't think this position is defensible, unless we insist on a rather tortured definition of "dogma."

Comment author: Peterdjones 12 May 2011 02:08:02PM 1 point [-]

a glance at a typical Unitarian web page will show a comprehensive and consistent list of left-wing ideological positions.

Dictatorship of the Proletariat? Class struggle? Ownership of the means of Production? Universal Free Healthcare, even?

Or did you mean the kind of policies that count as "left wing" in the US, and liberal/moderate/centre-left everywhere else?

Comment author: Vladimir_M 12 May 2011 05:30:42PM *  10 points [-]

Or did you mean the kind of policies that count as "left wing" in the US, and liberal/moderate/centre-left everywhere else?

"Everywhere else"? I hate to break the news, but there are other places under the Sun besides the Anglosphere and Western Europe! In most of the world, both by population and surface area, and including some quite prosperous and civilized places, many UU positions would be seen as unimaginably extremist. (Try arguing their favored immigration policies to the Japanese, for example.)

You are however correct that in other Western/Anglospheric countries, the level of ideological uniformity in the political mainstream is far higher than in the U.S., and their mainstream is roughly similar to the UU doctrine on many issues, though not all. (Among their intellectual elites, on the other hand, Unitarian Universalism might as well be the established religion.)

In any case, I didn't say that the UUs had the most extreme left-wing positions on everything. On the contrary, what they espouse is roughly somewhere on the left fringe of the mainstream, and more radical leftist positions are certainly conceivable (and held by some small numbers of people). What is significant for the purposes of this discussion is the apparent ideological uniformity, not the content of their doctrine. My points would hold even if their positions were anywhere to the left or right of the present ones, as long as they were equally uniform.

Comment author: handoflixue 12 May 2011 02:40:56AM 6 points [-]

Are you really claiming that if one were to express deep disagreement with those among the Unitarians, the reactions would be equally dispassionate, upfront, and open to argument as they usually are when the prevailing opinion is challenged on LW?

The more I think about it, the more I find it difficult to answer this question. The main obstacle I'm running up against is that the two have very different communication styles, so the answer varies heavily depending on which communication style you're seeking.

In my experiences, LessWrong is a very blunt, geeky approach to communication. It is also post-based, and thus neither real-time nor face-to-face. It's very good at problem solving and science. People are likely to try and refute my stance, or treat it as a factual matter to be empirically tested.

Unitarian Universalist churches, by contrast, have been very polite and mainstream in their approach to communication. It's also in-person, real-time interaction. They're very good at making people feel welcome and accepted. People are likely to simply accept that I happen to believe differently than they do, and to treat strong assertions as articles of faith, and therefore not particularly worth challenging.


I can't really find a way to translate between these two, so I can't really compare them.

Viewed through a mainstream, polite filter, I see LessWrong as a place that is actively hateful of religion, and extremely intolerant towards it, to the point of being willing to reject perfectly useful ideas simply because they happen to come from a religious organization.

Viewed through the blunt, geeky filter, I see UUs as blindly accepting and unwilling to actually challenge and dig into an idea; I feel like I can have a very interesting discussion, but in many respects I'm a lot less likely to change someone's mind (although, in other respects, I'd have a lot more luck using Dark Arts to manipulate a church-goer).

Comment author: handoflixue 12 May 2011 01:32:30AM *  2 points [-]

Well, there are seven formal UU values:

*the inherent worth and dignity of every person;

*justice, equity and compassion in human relations;

*world peace, liberty and justice for all;

*respect for the interdependent web of all existence;

*acceptance of one another and encouragement to spiritual growth in our congregations;

*a free and responsible search for truth and meaning; and

*the right of conscience and the use of the democratic process within our congregation and in society at large.

I would consider the first four to be values that are roughly shared with LessWrong, although there are definitely some differences in perspective. The fifth one, UUs focus on spiritual growth, LW focuses on growing rationality. The sixth principle is again shared. The seventh seems implemented in the LessWrong karma system, and I'd actually say LW does better here than the UUs.

It's also worth noting that these are explicitly "shared values", and not a creed. The general attitude I have seen is that one should show respect and tolerance even to people who don't share these values.

LessWrong is a place for rationalists to meet and discuss rationality. UU Churches are a place for UUs to meet and discuss their shared values. It doesn't serve LessWrong to have it dominated by "religion vs rationality" posts, nor posts trying to sell Christianity or de-convert rationalists. It doesn't serve the UUs to have church dominated by challenges to those values.

Comment author: Vladimir_M 12 May 2011 02:01:27AM *  12 points [-]

Well, there are seven formal UU values:

This is a list of applause lights, not a statement of concrete values, beliefs, and goals. To find out the real UU values, beliefs, and goals, one must ask what exact arrangements constitute "liberty," "justice," etc., and what exact practical actions will, according to them, further these goals in practice. On these questions, there is nothing like consensus on LW, whereas judging by the uniformity of ideological positions espoused on the Unitarian/UU websites, there does seem to be a strong and apparently unchallenged consensus among them.

(To be precise, the applause lights list does include a few not completely vague goals, like e.g. "world peace," but again, this says next to nothing without a clear position on what is likely to advance peace in practice and what to do when trade-offs are involved. There also seems to be one concrete political position on the list, namely democratism. However, judging by the responses seen when democracy is questioned on LW, there doesn't seem to be a LW consensus on that either, and at any rate, even the notion of "democracy" is rather vague and weaselly. I'm sure that the UU folks would be horrified by many things that have, or have historically had, firm democratic support in various places.)

Comment author: AdeleneDawner 12 May 2011 02:04:29AM 7 points [-]

For clarity: How do you think the members of your local UU congregation would react if one of their members turned up one day and said something along the lines of "you know, I've been thinking about it and doing the math, and it looks to me like war is actually pretty useful, instrumentally - it seems like it saves more lives than it takes, and at least in places with recruitment methods like ours, people who choose to be soldiers seem to get a fairly good deal out of it on average"?

Comment author: Swimmer963 12 May 2011 12:31:23AM 3 points [-]

Having been raised Unitarian Universalist, I always find it very odd that "religion" is conflated with "unquestionable dogma". I don't think Unitarians have that any more than LessWrong does.

I was raised a Unitarian Universalist too, by agnostic parents. It probably has a lot to do with my generally positive attitude towards religion. (I now sing in a High Anglican church choir and attend services regularly mostly because I find it benefits my mental health.)

Comment author: thomblake 12 May 2011 12:03:56AM 0 points [-]

Given that Unitarianism was originally Christian and yet some UUs have collectively embraced atheism, you are probably right about that.

Comment author: Peterdjones 12 May 2011 02:18:50PM -1 points [-]

I don't think Unitarians have that any more than LessWrong does.

i.e. they won't burn you at the stake, and they won't stick around to be questioned when they can find someone else to talk to who'll agree with them.

Comment author: handoflixue 12 May 2011 10:38:04PM 1 point [-]

I'm curious if you're being sarcastic or serious. It's hard to tell online :)

Comment author: Emile 10 May 2011 03:56:42PM *  4 points [-]

Nick Szabo has an interpretation of religious traditions that makes sense, though I'm not sure I fully agree.

(Edit) This essay: Is Rational Religion possible? might be of interest too.

It seems that there are significant numbers of Jews, Anglicans and Shintoists that don't believe in the theology and the supernatural stuff, but still identify as members of the religion and follow the traditions (I don't know if there are any of those among Mormons, though after hearing calcsam I'd increase my expectation).

Comment author: David_Gerard 11 May 2011 09:05:02AM *  2 points [-]

It seems that there are significant numbers of Jews, Anglicans and Shintoists that don't believe in the theology and the supernatural stuff, but still identify as members of the religion and follow the traditions

Culture is pretty strong. My girlfriend oscillates between Christianity and Paganism and is an active member of the local Church of England. They're currently trying to draft her as a volunteer for all sorts of things (the standard punishment for public display of sanity and competence). I'm a sceptical atheist on the fringes of, but still pretty much part of, said church community. Mind you, the C of E is hardly that unfriendly to atheists ... and Richard Dawkins still visits his, and neither it nor he catches fire as he walks in ...

Or: yes, religion is supposedly about the silly ideas and mad old books, but it only works as a community, and these communities will often include people who expressly repudiate the silly ideas and mad old books.

"If the Church of England relied on Christians, it'd be sharing a room with the Flat Earth Society" - Shelley (TV show), quoted from memory.

Comment author: [deleted] 10 May 2011 08:23:50PM *  0 points [-]

I can think of at least two other stable states - in one, you've had an experience that has acted as strong Bayesian evidence for you of the existence of $DEITY, but which is either a purely subjective experience or non-repeatable. As an example of this class of event, if I were to pray "Oh Lord, give me enough money to never have to work again" and then two hundred thousand people were to buy copies of my books in the next five years, that would be enough evidence that it would be rational for me to believe in God.

Another stable state might be someone who has been convinced by Frank Tipler's Omega Point hypothesis. Tipler himself is now clearly extremely irrational, but the hypothesis itself is taken seriously enough by people like David Deutsch (who is one of the less obviously-egregiously-stupid public intellectuals) that it's not obviously dismissable out-of-hand.

I'm sure there are others, too.

EDIT - when I said "in the next five years" I meant to type "the next five minutes", which would of course be much stronger evidence.

Comment author: Oscar_Cunningham 10 May 2011 09:02:20PM 6 points [-]

It doesn't seem to me to be possible to hold both rationality and religion in one's head at the same time without compartmentalization, which is one of the things rationality seeks to destroy.

As an example of this class of event, if I were to pray "Oh Lord, give me enough money to never have to work again" and then two hundred thousand people were to buy copies of my books in the next five years, that would be enough evidence that it would be rational for me to believe in God.

The reason why rationality destroys religion is precisely because there is no evidence of this kind. It's not a priori impossible to hold rationality and religion decompartmentalised in one's head, but it is impossible in this universe.

Comment author: Desrtopa 10 May 2011 10:54:13PM 4 points [-]

As an example of this class of event, if I were to pray "Oh Lord, give me enough money to never have to work again" and then two hundred thousand people were to buy copies of my books in the next five years, that would be enough evidence that it would be rational for me to believe in God.

Do you really think that would be enough? Even if you don't think that the God hypothesis has a truly massive prior probability to overcome, you'd still have to reconcile this with the fact that most prayers for improbable things go unanswered, to the point that nobody has ever provided a convincing statistical demonstration that it has any effect except on people who know that prayers have been made.

Taking this as sufficient Bayesian evidence to hold a belief in God seems like believing that a die is weighted because your roll came up a six, when you know that it's produced an even distribution of numbers in all its rolls together.
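The die analogy can be made quantitative. A rough sketch (the numbers are my own and purely illustrative: suppose "weighted" means sixes come up half the time instead of one time in six):

```python
from math import comb

def p_weighted(prior, rolls, sixes, p_w=0.5, p_f=1/6):
    """P(die is weighted | observed six count), with binomial likelihoods."""
    like_w = comb(rolls, sixes) * p_w**sixes * (1 - p_w)**(rolls - sixes)
    like_f = comb(rolls, sixes) * p_f**sixes * (1 - p_f)**(rolls - sixes)
    return prior * like_w / (prior * like_w + (1 - prior) * like_f)

# A single six triples the odds, but barely moves a sceptical prior:
print(p_weighted(0.01, 1, 1))    # ~0.029
# Sixty rolls with the fair-die-typical ten sixes crush the hypothesis:
print(p_weighted(0.01, 60, 10))  # effectively zero
```

The single roll is genuine evidence, just far too weak; the full roll history, like the full record of unanswered prayers, swamps it.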

Comment author: endoself 10 May 2011 11:11:32PM 1 point [-]

Even in that case, a very powerful being messing with you is more likely than an uncaused, ontologically fundamental very powerful being (and not just because of the conjunction - a caused, reducible very powerful being is far more likely). Or did you just mean that this point was less obvious, so it would be harder for someone to realize that they were wrong?

Comment author: Swimmer963 10 May 2011 08:38:58PM 1 point [-]

Tipler himself is now clearly extremely irrational

Was he more rational before? (I did read two of his books a while back, and I remember being very excited beforehand and very disappointed afterwards, but I can't remember enough specifics to say why.)

Comment author: [deleted] 10 May 2011 08:50:41PM 4 points [-]

I believe so. His career path seems to go: 70s - studies with John Wheeler, makes some small but clever contributions to cosmology and relativistic physics.

80s - Co-writes widely praised book The Anthropic Cosmological Principle with John Barrow, first suggests Omega Point hypothesis

90s - Writes The Physics Of Immortality, laying out the Omega Point hypothesis in much more detail and explicitly identifying the Omega Point with God. People think this is clever but going a little far. Tipler's contract for a textbook on gravitation gets cancelled and the university at which he has tenure stops giving him pay rises.

2000s - Writes The Physics Of Christianity, in which he suggests cloning Jesus from the Turin Shroud so we can learn how he annihilated baryons, becomes a referee for creationist journals and an occasional right-wing commentator, and argues that Barack Obama is evil because the luminiferous aether is real and because of a bit of the film Starship Troopers.

Comment author: JoshuaZ 11 May 2011 02:03:31AM 3 points [-]

The criticism of Obama was slightly more coherent than that. The Tribe paper in question really was an example of the common attempt for people to take ideas in math and physics and try to apply them as strong metaphors in other areas in ways that are really unhelpful and at best silly. In that regard, most of Tipler's criticism was straight on.

Comment author: [deleted] 11 May 2011 05:57:00PM 1 point [-]

Yeah, except for two facts: Obama had no actual input into Tribe's paper, and Tipler's physics in his paper is even less coherent than Tribe's.

Comment author: Swimmer963 10 May 2011 09:26:08PM 2 points [-]

he suggests cloning Jesus from the Turin Shroud so we can learn how he annihilated baryons, becomes referee for creationist journals and occasional right-wing commentator, argues that Barack Obama is evil because the luminiferous aether is real and because of a bit of the film Starship Troopers.

Ok that's really...random. (Overused and underdefined word but that was the response my brain gave me).

Comment author: TimFreeman 10 May 2011 09:40:12PM 4 points [-]

The Tipler/Obama/aether connection seemed bizarre enough that I looked it up:

http://pajamasmedia.com/blog/obama-vs-einstein/

Some quotes:

  • Einstein’s general relativity is just a special case of Newtonian gravity theory incorporating the ether

  • Hamilton-Jacobi theory is deterministic, hence quantum mechanics is equally deterministic

  • There was absolutely nothing revolutionary about twentieth century physics.

I agree on the "random" part.

Comment author: [deleted] 10 May 2011 09:59:41PM 3 points [-]

That's nothing. Read the full paper - http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1271310 . Forty-five pages of the most gloriously wrong thinking you'll ever come across in your life.

But then he'll come out with a piece of utterly lucid reasoning on applying Bayes' theorem to the Born probabilities like http://arxiv.org/abs/quant-ph/0611245 . Very, very strange man.

Comment author: gwern 11 May 2011 01:58:12AM 3 points [-]

I think this is a relevant rationality quote: http://lesswrong.com/lw/2ev/rationality_quotes_july_2010/28nw

Comment author: paper-machine 10 May 2011 10:06:23PM 2 points [-]

Wow. While I'm unsurprised that Tipler would take issue with yet another poetical injection of something that superficially looks like quantum physics into yet another unrelated subject area, I'm more surprised that he'd express it in such a bizarre manner. There's a whole paragraph where he name-drops his academic genealogy. And then he acts like Obama is making these claims, when at best he contributed "analytic and research assistance", whatever that means.

I read The Physics of Immortality as an undergrad in '04 and was skeptical of his major claims. I'm disappointed by his downward spiral into crackpot territory.

Comment author: David_Gerard 11 May 2011 08:11:22AM 0 points [-]

But if you don't cross-pollinate, you become a hick.

+1

Comment author: AdeleneDawner 10 May 2011 12:28:50PM 4 points [-]

We have already the label of dark arts for efficient but dangerous procedures.

I like this definition.

Comment author: badger 10 May 2011 12:34:41PM *  17 points [-]

This sequence still feels like it is privileging the hypothesis of the desirability of LDS organizational practices, but you make a good point. We lack condensed introductions. Eliezer handed out copies of the Twelve Virtues of Rationality in the past, but I don't remember any other attempts at pamphlets or booklets. How much can the material be compressed with reasonable fidelity?

Some possible booklet ideas:

  • You Are A Brain: map/territory distinction, mind projection fallacy, reductionism, mysterious answers
  • Short guide to cognitive biases: examples of biases that can be directly illustrated and are less likely to be turned against others, like hindsight bias or the Wason selection task.
  • Checklists: e.g. Polya and Adam Savage on problem solving, Personal MBA questions to improve results
  • Sequence overviews: Scientific Self-help, 37 ways words can be wrong
  • Techniques/Heuristics: noticing confusion, holding off solutions, consider-the-opposite, tracking/data collection, being specific, and perhaps touching on deliberative vs automatic cognition or Haidt's elephant and rider metaphor
  • You are already living with the truth: Litany of Gendlin (plausibly the most important thing beginning rationalists can hear), On being ok with the truth
  • Identity and rationality: Keeping your identity small, cached selves, entropic nature of organizations, social vs individual rationality
Comment author: David_Gerard 11 May 2011 08:10:11AM *  1 point [-]

This sequence still feels like it is privileging the hypothesis of the desirability of LDS organizational practices

Only insofar as it describes them at all. (Which, given what human brains are like, does automatically privilege the hypothesis inside them, annoyingly.) Remind yourself that there will be a space of other approaches to apply?

Comment author: Davorak 10 May 2011 07:18:47PM 3 points [-]

Brevity is key to implementation.

I like this idea. It can be difficult to read sequences if post after post you already agree with the idea. It would be wonderful to have the sequences compressed so that it was possible to easily find the sections which are new ideas so reading could be more targeted.

Comment author: cousin_it 10 May 2011 08:42:13AM 14 points [-]

There's a lot to learn from your posts and I appreciate your effort. But speaking for myself, if the LW community follows this route en masse, it'll make me sad and eventually drive me away. I don't want to be associated with anything that uses such techniques to spread.

Comment author: Vaniver 10 May 2011 02:05:44PM 10 points [-]

I don't want to be associated with anything that uses such techniques to spread.

Which techniques are you referring to? Techniques developed by the LDS, or techniques that have some other property?

I saw two concrete techniques in this post. "Help people implement positive changes through regular interaction" and "use brevity in your promotional materials." I can see problems with each of them, but I cannot see why those problems make them negative overall.

For the first, you might argue that you want Less Wrong to only comprise self-starters. I would direct you to the posts about akrasia, specifically the ones which detail how social influence can cut down on akrasia.

For the second, you might argue that we only want people with the high IQs and available free time who can get through the Sequences as they are. While the first might be desirable, I'm unsure about the second: are the Sequences really the shortest they could possibly be? Should we really be attracting the people who are willing to read tens of thousands of words before getting results, instead of proving our positive effects to others before asking them to commit?

Comment author: cousin_it 10 May 2011 02:29:53PM *  7 points [-]

My complaint isn't that the proposed meme-spreading techniques lack effectiveness. I agree that they're likely to be effective. My complaint is that I avoid people and communities that use such techniques, so if LW starts using them, I'll start avoiding LW.

Comment author: Zetetic 10 May 2011 04:14:14PM 12 points [-]

My biggest concern is that simply spreading the memes will be counterproductive if they are not spread properly (partial understanding would be a huge concern). If everyone is just guessing the teacher's password, we will probably have done more harm than good.

Comment author: fiddlemath 10 May 2011 04:05:21PM *  9 points [-]

I too have reservations about this material. But, I suspect that your recoiling and mine might stem from the representativeness heuristic - that anything associated with religion or its propagation techniques is Dark. I have a lot of negative associations there, and I'm betting many here do. However, religion's propagation techniques are unquestionably powerful, and we should be able to learn from them.

To counteract this bias somewhat, imagine having some system - I don't know the details yet, and neither do you - that would easily introduce the core concepts of rationality to anyone interested, and would make it easy to adopt those personal practices that one found positive or useful. Would this be a good thing?

I think so, yes, and clearly so -- if improved decision-making, easier willful action, and clearer analysis were already widespread, we'd all live with fewer of other people's bad decisions. Certainly, the clarity I've learned here has helped me; I'm willing to bet it'd help others as well.

If spreading rationality -- or, at least, making rationality more palatably available -- is a good thing, then we should as a community figure out what works to do that. If certain specific meme-spreading techniques are Dark or noxious, then we should identify what's wrong in them, and see if we can eliminate them without making the technique wholly ineffective.

So, can you identify what parts of those techniques are noxious to you?

Comment author: cousin_it 10 May 2011 06:41:29PM *  13 points [-]

The OP's previous post described a model of rationalist communities where you have "distillers" and "organizers" telling people to do stuff, some of which will be proselytizing. But I don't like being told what to do or telling others what to do, especially if it's proselytizing. So I would have no place in such a community.

Also it seems to me that when a product needs to be resold by its consumers, like religion or Amway, that means the product probably isn't any good. Imagine Steve Jobs using MLM to sell the iPhone! If Eliezer's ideas about solving confusing problems actually helped, some of the many researchers who read LW would've found them useful and told us about it. And if the sequences were as useful in everyday life as they are well-written, a lot of people would have demonstrated that convincingly by now. In either case we would have an iPhone situation and would beat customers off with a stick, not struggle to attract them.

ETA: this comment is a joint reply to Vaniver, XFrequentist, fiddlemath and Davorak, because their questions were quite similar :-)

Comment author: handoflixue 12 May 2011 01:18:25AM 1 point [-]

I would think there is some fruitful discussion to be had about which of these techniques are considered valuable, and which would be potentially alienating. Are you objecting to ideas such as "using brief summaries to sell the ideas" as well as to the idea of rationalist communities organized that way?

I'd agree that I don't want to be part of such a rationalist community, but the idea of using brevity to help market seems useful.

Comment author: saturn 10 May 2011 07:21:44PM 10 points [-]

Okay, but also imagine Steve Jobs trying to sell the original iPod or iMac (before Apple's huge rebound in popularity) with no TV ads, billboards, posters, product placements, press releases, slogans, over-hyped conferences, or presence in retail stores; just a website with a bunch of extremely long articles explaining what's great about Apple products.

Comment author: cousin_it 10 May 2011 09:24:48PM *  6 points [-]

In our case the task is simpler because the articles themselves are the product. Perhaps another reason for our disagreement is that I think new ideas should spread by their own merit, not through people explicitly trying to convert other people. The LW worldview is based on the ideas of many people (Daniel Kahneman, Hugh Everett, E.T. Jaynes, Judea Pearl, Robin Hanson...), none of whom ever tried to build communities of laymen for the express purpose of spreading their ideas in MLM style. It makes me cringe to even imagine this.

Comment author: saturn 11 May 2011 01:20:17AM 1 point [-]

Actually, I agree with you. I only meant to point out that without some kind of deliberate promotion even really good ideas can be overlooked, but there are ways to do that without acting like Mormon missionaries.

Comment author: CronoDAS 10 May 2011 08:48:57PM 4 points [-]

Also it seems to me that when a product needs to be resold by its consumers, like religion or Amway, that means the product probably isn't any good.

This is very similar to how Magic: the Gathering gets spread; most players learned how to play from someone else.

Comment author: Vaniver 10 May 2011 08:24:30PM 4 points [-]

But I don't like being told what to do or telling others what to do, especially if it's proselytizing. So I would have no place in a such a community.

What community organizational structure are you comfortable with? What tradeoffs will you accept between organizational structure and goal satisfaction?

Also it seems to me that when a product needs to be resold by its consumers, like religion or Amway, that means the product probably isn't any good.

Does rationalists telling other people about rationality make rationality worse?

If Eliezer's ideas about solving confusing problems actually helped, some of the many researchers who read LW would've found them useful and told us about it. And if the sequences were as useful in everyday life as they are well-written, a lot of people would have demonstrated that convincingly by now.

It seems to me, though, that the Mormon style of proselytizing is learning about prospective buyers, figuring out how to make their lives better, and then helping implement that. The benefit of that system is you have a knowledgeable person finding the highest-value tip for potential customers, which gets a lot more customers than having your catalog available at the public library and letting them find what suits them best. The question is not whether or not the tip is effective, but who pays to find that out.

Comment author: cousin_it 10 May 2011 09:53:55PM *  13 points [-]

What community organizational structure are you comfortable with? What tradeoffs will you accept between organizational structure and goal satisfaction?

I don't have any goals that would be well served by joining an authoritarian volunteer community. All my goals are well served by my job or my informal social network.

Comment author: wedrifid 10 May 2011 10:03:38PM 5 points [-]

I don't have any goals that would be well served by joining an authoritarian volunteer community. All my goals are well served by my job or my informal social network.

Everything cousin_it says in this thread, assume I said it as well.

Comment author: AdeleneDawner 10 May 2011 09:58:26PM 0 points [-]

All my goals are well served by my job or my informal social network.

Then why are you here at all? Or is LW included in the latter? (It's actually at the 'formal group' end of my social experience, but I may be more abnormal than I think...)

Comment author: cousin_it 10 May 2011 10:05:26PM *  13 points [-]

Yeah, LW is included in the latter. As far as I can tell, it doesn't yet require me to tell others what to do or be told what to do :-) For me it's a place to hang out with smart people and talk about interesting things, like an intellectual nightclub.

Comment author: wedrifid 10 May 2011 10:09:21PM 5 points [-]

For me it's a place to hang out with smart people and talk about interesting things, like an intellectual nightclub.

One that doesn't have bouncers at the door to prevent guys coming in unless they bring chicks with them, to keep the balance...

Comment author: XFrequentist 10 May 2011 03:28:08PM *  4 points [-]

Which specific techniques do you dislike?

Comment author: Davorak 10 May 2011 05:14:46PM 2 points [-]

What techniques/methodologies do you use to divide techniques you approve of from those you disapprove of? Stepping through this process with one of the techniques in the OP's post would clear up much of why you would leave Less Wrong if they became prominent.

Comment author: AlanCrowe 10 May 2011 12:08:33PM 11 points [-]

I think that you are responding to the post at too low a level of abstraction.

One of the pleasures of reading Isaac Asimov's Foundation is the scene in which Hari Seldon demonstrates the power of Psychohistory by designing an organisation, a variation of the Women's Institute, that grows and grows and grows. (And one of the shocks of re-reading Foundation is that the scene isn't actually in that book. Where is it?). One's intuition is that there are laws governing the dynamics of society and the rise and fall of empires. Can we discern enough of those laws to make sure that rationalism flourishes?

calcsam's post contains much food for thought, and I envy his plain, simple prose style. Hence my praise.

Reading between the lines to detect and reject a suggestion that we turn Less Wrong into a church along the lines of the Latter-day Saints is a mistake. Food for thought should be digested, not puked up.

Some rationalists come unstuck by becoming addicted to alcohol. Do we want a no-drinking rule, like the Latter-day Saints have? I don't think we should even be asking the question. I don't think it is helpful to draw parallels so closely. But we don't have to just drop the issue.

We are aware of over-confidence bias. Perhaps that is related to alcoholism. It is no secret that alcohol can be dangerously addictive. Any-one who lives in a big city sees the down-and-outs living on the street. And yet the problem of alcoholism self-limits at a fairly high level, as though people can see the danger and are over-confident of their own ability to avoid it.

It would be cool if rationalists could win in the sense of the rationalist community having a low rate of alcoholism. We are not interested in trying to do that by having a no-drinking rule, but that doesn't mean we are not interested in trying by other means.

Suppose, for the sake of argument, that over-confidence bias (I can handle it, it won't happen to me) is part of the cause of alcoholism. The means of combating alcoholism that align with our values is to find ways of bridging the gap between an academic understanding (that people in general are over-confident) and a personal understanding (that maybe I need to cut back my drinking because I am over-confident about being able to handle it.) I don't think any-one feels threatened by a community norm of helping each other to bridge the academic/personal gap.

Another example is the Latter-day Saints' rule against coffee. Not for us, obviously. On the other hand, we are into self-experimentation. Try it and see. Nullius in verba. That makes us a bit different. Most people who try giving up coffee to see if it makes their own life better face ribbing from their friends. The rationalist community could be more supportive than that. So we can look at calcsam's post and be inspired to build a community around our own values, and see how others, with different values, go about it.

Comment author: RichardKennaway 10 May 2011 01:57:07PM 8 points [-]

And one of the shocks of re-reading Foundation is that the scene isn't actually in that book. Where is it?

It's the short story "Snowball Effect" by Katherine MacLean. Collected in this anthology edited by Asimov.

Comment author: calcsam 10 May 2011 03:04:07PM 4 points [-]

Thanks Alan. You get my point. But even if rationalists don't adopt any LDS norms -- and I don't expect them/us to (I'm hovering at the edge of the community at the moment) -- understanding the framework here should be helpful.

Comment author: Cayenne 10 May 2011 09:23:22AM *  3 points [-]

I'm not interested in making this into a church-like group either. Some norms are useful (e.g. respond to words with words and never ever with bullets), while others (you must marry a rationalist? you must never do $list-of-sins or else?) would be abhorrent to me.

Helping people is a good goal to have. Regimenting their lives is not a goal I find appealing at all. I have enough trouble with my own life; why would I want to be in charge of others' lives too? Ick!

Edit - please disregard this post

Comment author: AdeleneDawner 10 May 2011 09:43:46AM 8 points [-]

I agree with the sentiment, here, but I also think it's a bit of a knee-jerk reaction, and that with a bit of work we can come up with some norms that we're already using that we'd like to spread. Rationalist taboo comes to mind, and actually updating based on evidence, and generally changing one's behavior to match one's beliefs. That last one seems to require a bit more give and take than just handing someone a set of rules, but I think that's a good thing, and we could streamline the process by coming up with a list of common beliefs and behavioral implications thereof (cryo, for example).

Comment author: Cayenne 10 May 2011 10:04:04AM *  7 points [-]

It's not a knee-jerk reaction, but more like a visceral rejection. The thought of this community becoming something with the feel of the church I grew up in makes me feel sick, and if it happened I would walk away and never look back. This is certainly a bias, but I would still do it.

We need to have a clear and concise list of taboos, skills, and beliefs that we want to make into norms, and then the whole community has to talk them over and make sure that we really, really want to make them into our norms. If we're going to start adopting practices from other groups, I believe this should be our highest priority.

Edit - please disregard this, it's needlessly prescriptive.

Comment author: AdeleneDawner 10 May 2011 10:32:46AM 3 points [-]

It's not a knee-jerk reaction, but more like a visceral rejection. The thought of this community becoming something with the feel of the church I grew up in makes me feel sick, and if it happened I would walk away and never look back. This is certainly a bias, but I would still do it.

I noticed your comment about being ex-Mormon after I wrote the grandparent. Even without that context, but especially with it, this is reasonable, and a good warning signal for us to look out for. Please try to give us a heads-up if we start getting close to that point, too, ok?

In a more general sense, I do think it's important to keep the look-and-feel of any organizational structure we put together different from the general look-and-feel of churches. I see a few advantages to this, most obviously that it will avoid driving away non-rationalist atheists and it will help remove us from direct competition with churches so that people who 'already have a religion, thanks' don't see that as a reason not to check us out.

We need to have a clear and concise list of taboos, skills, and beliefs that we want to make into norms, and then the whole community has to talk them over and make sure that we really, really want to make them into our norms. If we're going to start adopting practices from other groups, I believe this should be our highest priority.

Absolutely.

Also note: None of the above is intended as an actual endorsement of any particular plan. We should figure out what we might do before we decide whether to do that thing, as far as I'm concerned, and we're still in the first stage of that.

Comment author: Cayenne 10 May 2011 10:49:43AM *  8 points [-]

Well, to be slightly more clear, I am trans-gender. This is a sin in the LDS church, since the surgeries and hormones 'desecrate my temple' (temple == body). There is a limit to how much discrimination against ANY group I can stand before I leave, even if that group is 'those people that want to kill us because we don't believe in their god'.

At the same time, I really dislike the idea that I might be keeping a group from succeeding by giving negative input. I'm fairly likely to just withdraw without much fanfare if I decide that that is what's happening, since I hate drama and making a scene about something like that would feel like an attempted hostage situation. Just no.

Since I believe in the 'step-forward' method of getting things done, I'll start a discussion now to try to codify our norms.

Edit - please disregard this post, especially the last part. Empirical testing shows that I am not good at this kind of thing.

Comment author: NancyLebovitz 10 May 2011 12:09:40PM 2 points [-]

Could you expand on the things about LDS that you don't want to see replicated among rationalists?

Comment author: Cayenne 10 May 2011 12:22:44PM *  4 points [-]

Some of it is difficult to pull apart into clear thought, but I'll try.

I don't want to have a list of groups I have to hate to belong. I don't want to have someone trying to control my behavior by defining things as 'sin'. I don't want to be told 'we love you, we just don't like your actions', when it's clear that there is no love involved in any case. I don't want to have to remember people and feel sorry that they're part of a malignant memeplex, and that I can't do anything to help them. I don't want to dread going to a meet because I don't fit in.

No, I really don't like the LDS church. That's probably never going to change, though I'll try not to influence others' decisions on the matter. I don't hate the members, I just feel sad when I think of them, and of my ex-family.

Edit - please disregard this post

Comment author: Swimmer963 10 May 2011 12:34:28PM *  1 point [-]

I don't hate the members, I just feel sad when I think of them, and of my ex-family.

That makes me sad too. I don't have a particularly negative attitude towards religion (all my personal interactions with religions and religious people have been pretty positive and haven't included any of the aspects on your list), but I hear stories like yours about the incredibly toxic things people can do with their religions, and it depresses me, mostly because I don't think it's purely a symptom of people being religious. Otherwise, how could nearly all the religious people I've met be more accepting and less hypocritical about their daily life decisions than my atheist-by-default friends? It's more a symptom of people being flawed humans, and that is depressing.

Comment author: NancyLebovitz 10 May 2011 02:59:51PM 0 points [-]

How much of what you don't like about LDS is entangled with the organizational structure?

I don't have a strong answer, just some concerns.

It may be that a lot of what's wrong there is having a hard boundary between members and non-members. If so, rationalists may be able to beat that one by wanting people to be more rational, though there do seem to be some firm lines in this community, like being obligated to be a materialist.

I don't want to have to remember people and feel sorry that they're part of a malignant memeplex, and that I can't do anything to help them.

You may be stuck with that one, especially in regards to cryonics, at least in the sense that you can't do much to help them.

Comment author: Swimmer963 10 May 2011 03:09:56PM 0 points [-]

If so, rationalists may be able to beat that one by wanting people to be more rational, though there do seem to be some firm lines in this community, like being obligated to be a materialist.

I'm not sure what you mean by the word "materialist" in this context. Could you explain?

Comment author: Barry_Cotter 10 May 2011 03:40:23PM *  0 points [-]

Materialist as in reductionist, as in Thou Art Physics, not as in being materialistic.

Comment author: Swimmer963 10 May 2011 04:01:21PM 0 points [-]

Ok (sigh of relief). That's what I thought but I'm not used to seeing the word used to mean 'reductionist'.

Comment author: Cayenne 10 May 2011 03:09:32PM 0 points [-]

It is hard to say.

I have no doubt that rationalists will prevail eventually, and I wish luck to the ones that try.

Edit - please disregard this post

Comment author: Vaniver 10 May 2011 02:07:37PM 1 point [-]

This is certainly a bias, but I would still do it.

Isn't this the sort of thing you come here to overcome? Or am I just thinking of our sister site?

Comment author: Cayenne 10 May 2011 02:16:34PM 1 point [-]

It would have to be something I would want to overcome. I came here because the sequences were fascinating to read, but I find more and more that I simply can't consider myself to be rational in any meaningful way. I probably should try to overcome it, I suppose.

Comment author: David_Gerard 10 May 2011 09:51:55AM *  6 points [-]

Yes, I think it's a case of codifying the norms that are already held. Taking care to watch out for verbal overshadowing and similar errors.

The fears expressed by cousin_it and Cayenne are quite reasonable ones to hold, since every cause wants to be a cult. Perhaps take care to facilitate schisms and stay on good terms with them, to avoid the rules ossifying into detached levers and hence lost purposes?

Not that examples spring to mind. Perhaps annual Mixed Rationalist Arts tournaments, to keep a reality check in play.

(Heck, I think that one's a great first step, if we can work out what to compete on.)

Comment author: Vladimir_Nesov 10 May 2011 09:45:06AM *  7 points [-]

But rationalism doesn’t have a well-defined set of norms/desirable skills to develop. As a result, we Less Wrongians unsurprisingly also lack a well-developed practical system for implementation.

Implementation of what? What's the purpose of these hypothetical norms? There's no point in propagating arbitrary norms. You are describing it backwards.

Comment author: Swimmer963 10 May 2011 01:37:42PM 2 points [-]

But rationalism doesn’t have a well-defined set of norms/desirable skills to develop

I'm not sure that "norms" are the same as "desirable skills to develop". The LessWrong community definitely has a list of desirable skills: improving one's understanding of Bayes, for example. Maybe not well-defined, though.

Comment author: AdeleneDawner 10 May 2011 11:17:23AM 2 points [-]

Implementation of what?

Honestly, I think I would have balked if calcsam had offered specific answers to this question, rather than the general principle of deriving them from the "theology". He is a relative outsider, and I think this is something we should be hashing out for ourselves.

Comment author: Vladimir_Nesov 10 May 2011 01:20:46PM *  4 points [-]

I'm not sure it makes sense to build an idea around a premise that there is a central problem of propagating norms before we have some examples of norms which should be propagated.

Comment author: Eneasz 10 May 2011 03:21:56PM 3 points [-]

What about things like "overcome your biases", "raise the sanity waterline", and "win the stars"?

Comment author: Dorikka 10 May 2011 03:55:54PM 0 points [-]

Or, rather, norms which achieve these goals. These don't seem low-level enough to be norms in themselves.

Comment author: Davorak 10 May 2011 05:19:47PM 0 points [-]

I don't know; the LDS example "Your purpose on earth is to become like God" is pretty big. Big goals are good if they are backed up with supportive low-level goals.

Comment author: wedrifid 10 May 2011 05:10:06PM -1 points [-]

These don't seem low-level enough to be norms in themselves.

You can't get all that much higher than 'win the stars'! :P

Comment author: Cayenne 10 May 2011 11:24:08AM *  0 points [-]

It seems that the proper answer to this is to develop our norms in a rational manner, and reject arbitrary norms that have no purpose.
Edit - please disregard this post

Comment author: AlanCrowe 10 May 2011 08:01:15AM 4 points [-]

This clear and brief essay on the mechanics of community building is a valuable contribution to Less Wrong. Thank you.

Comment author: Bongo 10 May 2011 03:10:01PM *  1 point [-]

There's nothing to "implement" in the sequences. They're philosophical blog posts.

OTOH, I guess you could find something to implement from any text if you thought hard enough. But I don't think the sequences straightforwardly mandate any particular way of life or something to implement.

(I've been editing this comment after posting it, sorry for confusion.)

Comment author: jasonmcdowell 12 May 2011 10:49:03PM *  5 points [-]

I'm going to start having kids in a few years. I have my eye on some of the sequences - such as Mysterious Answers to Mysterious Questions. I need to find a way to distill this stuff down, so I can teach it to my children.

Comment author: Eneasz 10 May 2011 03:15:28PM 8 points [-]

The number of people on this site striving to become more rational, with the sequences as a foundation, suggests otherwise.