Reading this, I wonder why an LDS missionary got interested in a rationalist community that is generally hostile to religion. I would appreciate some explanation of the author's motivations. Strictly speaking, this is irrelevant to the message, but being confused about one's aims somewhat lowers my trust in one's suggestions. This is not to say that the suggestions themselves are suspicious or clearly wrong - on the contrary, they are written in an impartial style that fits LW customs very well, which makes me even more curious about the author's background.
On a slightly different note, there is only so much one can improve at the organisational level, and we should keep in mind what expectations we have of this community. In a sense, I second Vladimir Nesov's and cousin_it's comments. Community building is instrumentally important, but it really shouldn't become a major terminal value if we want to maintain a high level of rationality. Aspiring rationalists are in danger of wandering in a strange circle: at first, they crave to get rid of common biases and to develop abilities for efficient truth seeking. But then, these very abilities lead them to discover th...
Reading this, I wonder why an LDS missionary got interested in a rationalist community that is generally hostile to religion. I would appreciate some explanation of the author's motivations.
Because there are things I can learn here. I can handle the hostility to religion. But if you don't cross-pollinate, you become a hick.
It doesn't seem to me to be possible to hold both rationality and religion in one's head at the same time without compartmentalization, which is one of the things rationality seeks to destroy.
I can actually quite easily accept that it could be a good idea for rationalists to adopt some of the community-building practices of religious groups, but I also think that rationality is necessarily corrosive to religion.
If you've squared that circle, I'd be interested to hear how. Being somewhat religious for the social bit but having excised the supernaturalism is the only stable state I can think of.
compartmentalization, which is one of the things rationality seeks to destroy
Yes, in the ideal limit, rationalists don't compartmentalize. But decompartmentalizing too early, before you learn the skill of sanely resolving conflicts between beliefs and values in different compartments (rather than, for example, going with the one you feel more strongly), is a way to find the deepest, darkest crevices in the Valley of Rationality.
I also think that rationality is necessarily corrosive to religion.
I would agree that there is a level of rationality at which religious beliefs become impossible, though to the extent that a religious person takes their religion seriously, and expects effective rationality to produce accurate beliefs, they should not expect this until it actually happens to them. It also occurs to me that helping rationalists establish the level of social support available within the Mormon community (as calcsam is doing) is an effective way of establishing a line of retreat for a religious person who is uncertain about retaining their religious beliefs.
Well, there are seven formal UU values:
This is a list of applause lights, not a statement of concrete values, beliefs, and goals. To find out the real UU values, beliefs, and goals, one must ask what exact arrangements constitute "liberty," "justice," etc., and what exact practical actions will, according to them, further these goals in practice. On these questions, there is nothing like consensus on LW, whereas judging by the uniformity of ideological positions espoused on the Unitarian/UU websites, there does seem to be a strong and apparently unchallenged consensus among them.
(To be precise, the applause lights list does include a few not completely vague goals, like e.g. "world peace," but again, this says next to nothing without a clear position on what is likely to advance peace in practice and what to do when trade-offs are involved. There also seems to be one concrete political position on the list, namely democratism. However, judging by the responses seen when democracy is questioned on LW, there doesn't seem to be a LW consensus on that either, and at any rate, even the notion of "democracy" is rather vague and weaselly. I'm sure that the UU folks would be horrified by many things that have, or have historically had, firm democratic support in various places.)
Can you point to something specific in the UU literature that makes you feel that they're less tolerant of dissent than LessWrong?
Before I even click on a link to a Unitarian Universalist website, I know with very high probability that there is going to be a "social justice" section espousing ideological positions on a number of issues. And for any such section, I can predict with almost full certainty what precisely these positions will be before I even read any of it.
Now, the UU folks would probably claim that such agreement exists simply because these positions are correct. However, even if I agreed that all these positions are correct, given the public controversy over many of these issues, it would still seem highly implausible that such ideological uniformity could be maintained in practice in a group highly tolerant of dissent. In contrast, I see nothing comparable on LW.
You say:
...LessWrong has this going for it as well, though: there's a strong thread of anti-religion bias, and I'd say there's a moderate pro-cryonics/singularity bias. I don't see a lot of posts about how SIAI is a waste of time and money, or how Christianity is really misunderstood and we
Or did you mean the kind of policies that count as "left wing" in the US, and liberal/moderate/centre-left everywhere else?
"Everywhere else"? I hate to break the news, but there are other places under the Sun besides the Anglosphere and Western Europe! In most of the world, both by population and surface area, and including some quite prosperous and civilized places, many UU positions would be seen as unimaginably extremist. (Try arguing their favored immigration policies to the Japanese, for example.)
You are however correct that in other Western/Anglospheric countries, the level of ideological uniformity in the political mainstream is far higher than in the U.S., and their mainstream is roughly similar to the UU doctrine on many issues, though not all. (Among their intellectual elites, on the other hand, Unitarian Universalism might as well be the established religion.)
In any case, I didn't say that the UUs had the most extreme left-wing positions on everything. On the contrary, what they espouse is roughly somewhere on the left fringe of the mainstream, and more radical leftist positions are certainly conceivable (and held by some small numbers of people). What is significant for the purposes of this discussion is the apparent ideological uniformity, not the content of their doctrine. My points would hold even if their positions were anywhere to the left or right of the present ones, as long as they were equally uniform.
I can certainly fail to handle them effectively.
If 'spirituality' helps you to handle these things effectively, that is empirically testable. It is not part of the 'non-falsifiable' stuff. In fact, whatever you find useful about 'spirituality' is necessarily empirical in nature and thus subject to the same rules as everything else.
Most of the distaste for 'spirituality' here comes from a lack of belief in spirits, for which good arguments can be provided if you don't have one handy. If your 'spirituality' has nothing to do with spirits, it should probably be called something else.
I can't have "incorrect" goals or emotions
You can have goals that presuppose false beliefs. If I want to get to Heaven, and in fact there is no such place, my goal of getting to Heaven at least closely resembles an "incorrect goal".
This raises an interesting question -- if a Friendly AI or altruistic human wants to help me, and I want to go to Heaven, and the helper does not believe in Heaven, what should it do? So far as I can tell, it should help me get what I would want if I had what the helper considers to be true beliefs.
In a more mundane context, if I want to go north to get groceries, and the only grocery store is to the south, you aren't helping me by driving me north. If getting groceries is a concern that overrides others, and you can't communicate with me, you should drive me south to the grocery store even if I claim to want to go north. (If we can exchange evidence about the location of the grocery store, or if I value having true knowledge of what you find if you drive north, things are more complicated, but let's assume for the purposes of argument that neither of those hold.)
This leads to the practical experiment of asking religious peop...
This sequence still feels like it is privileging the hypothesis of the desirability of LDS organizational practices, but you make a good point. We lack condensed introductions. Eliezer handed out copies of the Twelve Virtues of Rationality in the past, but I don't remember any other attempts at pamphlets or booklets. How much can the material be compressed with reasonable fidelity?
Some possible booklet ideas:
There's a lot to learn from your posts and I appreciate your effort. But speaking for myself, if the LW community follows this route en masse, it'll make me sad and eventually drive me away. I don't want to be associated with anything that uses such techniques to spread.
I think that you are responding to the post at too low a level of abstraction.
One of the pleasures of reading Isaac Asimov's Foundation is the scene in which Hari Seldon demonstrates the power of psychohistory by designing an organisation, a variation of the Women's Institute, that grows and grows and grows. (And one of the shocks of re-reading Foundation is that the scene isn't actually in that book. Where is it?) One's intuition is that there are laws governing the dynamics of society and the rise and fall of empires. Can we discern enough of those laws to make sure that rationalism flourishes?
calcsam's post contains much food for thought, and I envy his plain, simple prose style. Hence my praise.
Reading between the lines to detect and reject a suggestion that we turn Less Wrong into a church along the lines of the Latter-day Saints is a mistake. Food for thought should be digested, not puked up.
Some rationalists come unstuck by becoming addicted to alcohol. Do we want a no-drinking rule, like the Latter-day Saints have? I don't think we should even be asking the question. I don't think it is helpful to draw parallels so closely. But we don't have to just drop the issue.
We are aware ...
And one of the shocks of re-reading Foundation is that the scene isn't actually in that book. Where is it?
It's the short story "The Snowball Effect" by Katherine MacLean. Collected in this anthology edited by Asimov.
I don't want to be associated with anything that uses such techniques to spread.
Which techniques are you referring to? Techniques developed by the LDS, or techniques that have some other property?
I saw two concrete techniques in this post. "Help people implement positive changes through regular interaction" and "use brevity in your promotional materials." I can see problems with each of them, but I cannot see why those problems make them negative overall.
For the first, you might argue that you want Less Wrong to only comprise self-starters. I would direct you to the posts about akrasia, specifically the ones which detail how social influence can cut down on akrasia.
For the second, you might argue that we only want people with the high IQs and available free time who can get through the Sequences as they are. While the first might be desirable, I'm unsure about the second: are the Sequences really the shortest they could possibly be? Should we really be attracting the people who are willing to read tens of thousands of words before getting results, instead of proving our positive effects to others before asking them to commit?
My biggest concern is that simply spreading the memes will be counterproductive if they are not spread properly (partial understanding would be a huge concern). If everyone is just guessing the teacher's password, we will have probably done more damage than good.
I too have reservations about this material. But, I suspect that your recoiling and mine might stem from the representativeness heuristic - that anything associated with religion or its propagation techniques is Dark. I have a lot of negative associations there, and I'm betting many here do. However, religion's propagation techniques are unquestionably powerful, and we should be able to learn from them.
To counteract this bias somewhat, imagine having some system - I don't know the details yet, and neither do you - that would easily introduce the core concepts of rationality to anyone interested, and would make it easy to adopt those personal practices that one found positive or useful. Would this be a good thing?
I think so, yes, and clearly so -- if improved decision-making, easier willful action, and clearer analysis were already widespread, we'd all live with fewer of other people's bad decisions. Certainly, the clarity I've learned here has helped me; I'm willing to bet it'd help others as well.
If spreading rationality -- or, at least, making rationality more palatably available -- is a good thing, then we should as a community figure out what works to do that. If certain specific meme-spreading techniques are Dark or noxious, then we should identify what's wrong in them, and see if we can eliminate them without making the technique wholly ineffective.
So, can you identify what parts of those techniques are noxious to you?
The OP's previous post described a model of rationalist communities where you have "distillers" and "organizers" telling people to do stuff, some of which will be proselytizing. But I don't like being told what to do or telling others what to do, especially if it's proselytizing. So I would have no place in such a community.
Also it seems to me that when a product needs to be resold by its consumers, like religion or Amway, that means the product probably isn't any good. Imagine Steve Jobs using MLM to sell the iPhone! If Eliezer's ideas about solving confusing problems actually helped, some of the many researchers who read LW would've found them useful and told us about it. And if the sequences were as useful in everyday life as they are well-written, a lot of people would have demonstrated that convincingly by now. In either case we would have an iPhone situation and would beat customers off with a stick, not struggle to attract them.
ETA: this comment is a joint reply to Vaniver, XFrequentist, fiddlemath and Davorak, because their questions were quite similar :-)
Okay, but also imagine Steve Jobs trying to sell the original iPod or iMac (before Apple's huge rebound in popularity) with no TV ads, billboards, posters, product placements, press releases, slogans, over-hyped conferences, or presence in retail stores; just a website with a bunch of extremely long articles explaining what's great about Apple products.
What community organizational structure are you comfortable with? What tradeoffs will you accept between organizational structure and goal satisfaction?
I don't have any goals that would be well served by joining an authoritarian volunteer community. All my goals are well served by my job or my informal social network.
I don't have any goals that would be well served by joining an authoritarian volunteer community. All my goals are well served by my job or my informal social network.
Everything cousin_it says in this thread, assume I said it as well.
Yeah, LW is included in the latter. As far as I can tell, it doesn't yet require me to tell others what to do or be told what to do :-) For me it's a place to hang out with smart people and talk about interesting things, like an intellectual nightclub.
I agree with the sentiment, here, but I also think it's a bit of a knee-jerk reaction, and that with a bit of work we can come up with some norms that we're already using that we'd like to spread. Rationalist taboo comes to mind, and actually updating based on evidence, and generally changing one's behavior to match one's beliefs. That last one seems to require a bit more give and take than just handing someone a set of rules, but I think that's a good thing, and we could streamline the process by coming up with a list of common beliefs and behavioral implications thereof (cryo, for example).
Yes, I think it's a case of codifying the norms that are already held. Taking care to watch out for verbal overshadowing and similar errors.
The fears expressed by cousin_it and Cayenne are quite reasonable ones to hold, since every cause wants to be a cult. Perhaps take care to facilitate schisms and stay on good terms with them, to avoid the rules ossifying into detached levers and hence lost purposes?
Not that examples spring to mind. Perhaps annual Mixed Rationalist Arts tournaments, to keep a reality check in play.
(Heck, I think that one's a great first step, if we can work out what to compete on.)
Well, to be slightly more clear, I am transgender. This is a sin in the LDS church, since the surgeries and hormones 'desecrate my temple' (temple == body). There is a limit to how much discrimination against ANY group I can stand before I leave, even if that group is 'those people that want to kill us because we don't believe in their god'.
At the same time, I really dislike the idea that I might be keeping a group from succeeding by giving negative input. I'm fairly likely to just withdraw without much fanfare if I decide that that is what's happening, since I hate drama and making a scene about something like that would feel like an attempted hostage situation. Just no.
Since I believe in the 'step-forward' method of getting things done, I'll start a discussion now to try to codify our norms.
Edit - please disregard this post, especially the last part. Empirical testing shows that I am not good at this kind of thing.
But rationalism doesn’t have a well-defined set of norms/desirable skills to develop. As a result, we Less Wrongians unsurprisingly also lack a well-developed practical system for implementation.
Implementation of what? What's the purpose of these hypothetical norms? There's no point in propagating arbitrary norms. You are describing it backwards.
Just so you know, what you're advocating for LW are practices that have helped Christianity become a dominant and universalizing religion. Christians want everyone to be a Christian; that's basic to Christianity. Does, let's call it "rationalism", want everyone to be a rationalist? I guess that's a good question, and should be asked.
But let's also be mindful of how Christianity tries to attain a universal status:
"a strong focus on strengthening the family"
It is key that Christianity spreads within the family, and importantly, through generations. "Be fruitful and multiply" belongs here. You shouldn't have non-Christian members of the family.
"daily family prayer and scripture study"
Not just a strong family, but a strong Christian family. The ties of family should be used for religious purposes.
"sex only inside marriage"
Every natural human desire needs to be mediated with religious meaning and purpose; this makes people lustful for religion.
This is what you call "the basic package". The basic package has reasons for its existence, but not reasons that rationalists would necessarily agree with.
But rationalism doesn’t have a well-defined set of norms/desirable skills to develop.
Actually changing your mind, learning the simple math of various fields, and becoming more luminous seem to represent a set of desirable skills to me, though I admit that is far from comprehensive. See also the twelve virtues of rationality.
Note that I'm not anywhere close to finished with my series 'The Science of Winning at Life.' I've barely started.
Brevity is key to implementation.
I like this idea. It can be difficult to read the sequences when, post after post, you already agree with the idea. It would be wonderful to have the sequences compressed so that it was easy to find the sections containing new ideas, making reading more targeted.
This clear and brief essay on the mechanics of community building is a valuable contribution to Less Wrong. Thank you.
There's nothing to "implement" in the sequences. They're philosophical blog posts.
OTOH, I guess you could find something to implement from any text if you thought hard enough. But I don't think the sequences straightforwardly mandate any particular way of life or something to implement.
(I've been editing this comment after posting it, sorry for confusion.)
The number of people on this site striving to become more rational, with the sequences as a foundation, suggests otherwise.
Related to: Lessons from Latter-day Saints, Building Rationalist Communities overview
This is my basic thesis:
Using Eliezer’s levels scheme, these are the three descending levels on which belief systems operate: theology, norms, and implementation.
I’ll give some examples. Here’s a general example, again from the Latter-day Saints:
Here’s one that I often dealt with as a missionary:
I did both of these (with different people), and they worked.
Norms and Implementation
As a missionary for the Church, my basic role was to:
There’s a lifestyle change here.
The “basic package” (my terminology), which is a prerequisite to joining the church, includes: a strong focus on strengthening the family, daily family prayer and scripture study, the aforementioned health code, and sex only inside marriage. The glue is weekly church attendance, ensuring membership in a community that shares the same values.
After the “basic package,” it gets a bit more complex, as there are lots of higher-level elements of this lifestyle. To sample a few in no particular order:
Obviously these are different from rationalist norms, but my point is that they are fairly comprehensive. Though each topic is fairly regularly discussed in church, it’s impossible to implement them into your life all at once. It’s easy to feel overwhelmed by the flood of new information. (Sound familiar?)
And that is why we were there: to design mini-programs for each person.[3] We would isolate a couple of specific standards that would be effective for person X, and assist in implementation. If they liked it and wanted more, we helped them implement the “basic package” lifestyle.
This decision, that they liked it and wanted more, was the single most crucial decision that someone could make. It is directly related to Bhagwat’s Law of Commitment: “The degree to which people identify with your group is directly proportional to the amount of stuff you tell them to do that works.” I will discuss this further in a subsequent post.
Okay, so how does this apply to Less Wrongians?
Less Wrong has its version of a theological framework – the Sequences. They give a comprehensive set of statements about the way the world works, drawn from evolutionary psychology, anthropology, Bayesian statistics, etc.
But rationalism doesn’t have a well-defined set of norms/desirable skills to develop. As a result, we Less Wrongians unsurprisingly also lack a well-developed practical system for implementation.
You may cite lukeprog’s guide. That’s good, but it’s only six posts. Less Wrong needs a lot more of it!
Or maybe you’ll say that if you read the Sequences carefully, etc, etc. Well, I did. Mysterious Answers to Mysterious Questions is 51 dense pages in Word, or about 25,000 words. This is an (extremely good) foundational text. It is not a how-to manual.[4]
Brevity is key to implementation.
For Latter-day Saints, the basic explanation of family standards is about 6000 words (95% of the important stuff is from page 4 to 15). The basic guide for teenagers is about 4000 words, and the basic guide for running a church organization is about 12000 words. And each one is very clear about what to do. (The teenage guide most clearly illustrates this point about brevity.)
The easiest way to begin building a how-to manual is for LW members to post specific, short personal examples of how they applied the principles of rationality in their day-to-day lives. Then they should collect all of the links somewhere, probably on the wiki.
If this sounds salesman-y or cheesy to you, or if you're extremely skeptical about religion, I quote a commenter on my last post. “If this works for people that are obviously crazy," said Vaniver, "that suggests it'll work for people who are (hopefully obviously) sane.”
[1] Admittedly, this also supports other norms, such as ‘marry another Latter-day Saint.’
[2] I’m not claiming this is perfect. Over the four years since I joined, I've encountered various amounts of ingroup snobbery, use of these standards to judge others, cliquishness, and intolerance towards certain groups, primarily gays. Plus all of the normal human imperfections.
[3] In designing and sequencing programs, we generally used a simple cost-benefit standard: how much will this help X vs. how much effort will it cost X?
[4] By comparison: the Bible is a foundational text of Christianity. The Purpose-Driven Life is a derivative how-to manual. This is a distillation of the Sequences, which is at least a start.