Related to: Lessons from Latter-day Saints, Building Rationalist Communities overview, Holy Books (Or Rationalist Sequences) Don't Implement Themselves

My thesis:

It doesn’t matter what ideas are conveyed on Less Wrong, or in LW meetings -- the subset that matters is what group members resolve to do. Discussion of these 'resolves', and of people's experiences carrying them out, is useful in creating an expectation that people level up their skills.

Intelligent discussion of ideas is always refreshing. But translating that into action is more difficult.

Our learned reflexes are deep. They need to be overridden. How? Practice.

One woman I taught in India (we’ll call her Girija) was 35 years old, extremely intelligent, and really wanted to change her life, but had incredibly low levels of self-confidence. Every time we met Girija, we’d have a really sharp discussion, followed by her pouring her heart out to us. It was the same every time, and though we enjoyed the visits, and the food she would feed us, she never seemed to be getting anywhere.

If she really wanted to fundamentally change her life, our weekly meetings weren’t enough. (Similarly, weekly meetups are a good start, but if you really want to be learning rationality you should be practicing every day.)

We felt that if Girija spent some time every day with her 9-year-old daughter and live-in boyfriend, reading the scriptures together, they would be happier. We explained this to her frequently, and she said she would start -- but she never did it.

One week, through cleverly calling Girija and chatting for 10 minutes every day, we got her to do it. After the week was over, we asked her how it went.

“You know, it was really good,” she said. “Sandeep and I have been getting along a lot better this week because we did that.”

It was like a light had turned on in her head. Because we followed up, she did it, and was far more motivated to do more things afterwards.[1] 

Let me give two simple examples of goal, project, and follow-up.[2]

  • GOAL: To become better at noticing logical fallacies as they are being uttered
  • PROJECT: A certain Less Wrong group could watch a designated hour of C-SPAN -- or a soap opera, or a TV show -- and try to note down all the fallacies.
  • FOLLOW-UP: Discuss this on a designated thread. Afterwards, compile the arguments and link to the file, so that anyone in the LW community can repeat this on their own and check against your conclusions. Reflect communally at your next LW meeting. 
  • GOAL: To get into fewer arguments about definitions.
  • PROJECT: Ask, "Can you give me a specific example of that?" or "Can you be more concrete?" in everyday conversations. Make a challenging goal for how often you will do this – this is pretty low-hanging fruit.
  • FOLLOW-UP: Write down instances in your journal. Share examples communally at your next LW meeting.

I came up with these in about five minutes. Having spent more time in the community than I have, you will all be able to generate more and better possibilities.

Some points about Projects:

  • Here are some ideas that can easily be made into Projects. Thanks to the commenters on the last post.
  • Projects don't have to be group-based, but groups motivate doing stuff.
  • Projects should be shorter than the posts linked above. The above Goal/Project/Follow-Up kernels are 85 and 57 words, respectively. Brevity is key to implementation.
  • There is currently no central database of Rationality Projects or people's experiences trying to implement them. (Correct me if I'm wrong here.)
  • Feedback on implementation is essential for improving practices.

Finally, here is a really 'low-cost' way to make a project and follow up. Right before the conclusion of a Less Wrong group, give everyone a slip of paper and ask them to write down one thing they are going to do differently next week as a result of the discussion. For two minutes (total) at the beginning of the next meeting, let people tell what they did.

Some notes and warnings:

Doing this in a fraternalistic manner, not a paternalistic manner, will be a key to success.[3] Community agreement that We Should Do This is important before launching a Project.

Beware of the following tradeoff:

  • If you implement Projects, you will alienate some people. Even if Projects are determined by consensus, there will be some people who don’t want to do any Project, and they will feel marginalized and excluded.
  • If you don't implement Projects, people will improve their Rationality skills at a far slower pace.[4] You will thus run afoul of Bhagwat’s Law of Commitment: “The degree to which people identify with your group is directly proportional to the amount of stuff you tell them to do that works." But ultimately, commitment drives growth: more leadership to organize stuff, more people bringing friends, and so on.

I will discuss this more later, along with possible solutions. Latter-day Saints, with a large emphasis on doing things, have high levels of commitment; however, there are definitely people who would come to church more if they were expected to do less.

Please post any ideas you have for Projects in the comments.


[1] Even subtracting the religious element, common goals reduce conflict.

[2] Here are some keys to following up that I learned. In two years, I probably applied this with about 600 people:

  • Following up is mere nagging (and equally ineffective) unless the person/group actually wanted to do the task in the first place.
  • Congratulating people when they did do something was far more important than expressing disappointment when they didn’t do it – the 80/20 rule applies.
  • I often felt afraid to ask someone if they had done what they promised to do, because they probably hadn’t, and I didn’t know what I should say then.
  • But awkwardness is contagious; if you act awkward when talking to someone, the other person will feel awkward too. Be genuinely excited, and they will reflect this.
  • It’s all about how you ask the question. “How did you like reading X?” is far better than “Did you read X?” Use humor and make the task seem easy to do.
  • Don’t be self-righteous; actively deprecate yourself if necessary.
  • Each person has different ways they like – and don’t like – being followed up with.

[3] Coming from my experience as a Latter-day Saint missionary, my personal examples are all fairly paternalistic. With tweaks, they can all be made fraternalistic. The sentiment has been expressed that “I don’t like people telling me what to do”; making Projects fraternalistic will avoid that pitfall.

[4] I say 'far slower' based on my missionary experience. When people were dedicated to specific projects, they seemed to improve a lot faster.

81 comments

Thank you, calcsam, for this sequence, and apologies in advance to everyone for this being a bit of a rant and likely having been said before. I fear that the very practical suggestions are going to be lost because people's brains are being overridden by a combination of:

  • He's using examples and techniques from LDS, who are evil cultish religious people using these techniques for evil cultish reasons!
  • These techniques are designed to get someone to be inclined to change, or even worse to identify as members of a community and perhaps even adopt beliefs more often or more effectively than they would have by pure logic and reading blog posts. Dark Arts!

This big danger that Less Wrong is going to turn into a cult is a phantom. It has always been a phantom. Starting a cult whose core teaching is essentially "Think for yourself, schmuck!" together with techniques for doing so effectively may be the worst idea for a cult in world history.

If there is a cult here, not that I think there is, it is the cult of pure reason that thinks it is a sin to use any technique that could possibly reinforce a false belief or a behavior we might dislike, the cult of people crying out "... (read more)

9SilasBarta13y
But still good enough if we're not careful.
3Nornagest13y
SilasBarta has already pointed out the obvious counterexample; a variety of other vaguely cultish institutions, such as Anton LaVey's Church of Satan, also share goals that are, if not identical, then certainly within spitting distance. More importantly, though, I don't think "think for yourself, schmuck" is actually a core teaching of LW; LW-rationality is aimed at achieving success (however defined) rather than maintaining some ideal of independent thought, and it'll happily discourage any ideas, however independent, that its group consensus agrees are maladaptive. This isn't even a bad thing; independence for its own sake is best left to the Discordians. I don't think LW is a cult or in serious danger of becoming one in the near term. But its core ideals alone are not sufficient to ensure that it doesn't have the abstract potential to become one.
3zntneo13y
I wonder if having someone who always disagreed with us would help prevent the cultishness. I know there is evidence that doing so improves decisions, and I think I remember there being evidence that it helps stop groupthink. So maybe we could implement what CalcSam said, but add that someone be designated as the person who must disagree with us (this could be different people or the same person).
4hamnox13y
Like the subverter in a paranoid debate? I think that would actually be really useful, or at least a lot of fun (which has a use in and of itself). I would stipulate that it NOT be just one person, though. There ought to be multiple people, trading off to diffuse attention, or whoever is designated could easily become a strawman effigy to be mocked and denounced. The story in my head goes: Every once in a while, an ace of spades (if we can get it custom, red or gold on black would be an epic color scheme) will be discreetly slipped to a randomly selected acolyte or two before the meeting has begun. These people have license to be as contrary and anti-consensus as they can get away with without being found out. It will be given away at the end of the meeting, unless they'd like the actions and statements they made that day to stand as-is...
1beoShaffer13y
I like this suggestion but might tweak it a bit to say that everybody draws from a deck of cards (or some similar method) instead of trying to slip cards just to a specific person. It seems easier and doesn't create the problem of the person doing the slipping knowing who the subverter is. Also, it is easy to repurpose if we need other randomly assigned positions.
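The deck-drawing mechanism described in this thread can be sketched in a few lines of Python. The function and names here are hypothetical, purely to illustrate the mechanics: a deck with a fixed number of marked cards is shuffled and dealt one per attendee, so the designated dissenters are random and unknown even to whoever prepared the deck.

```python
import random

def deal_subverter_cards(attendees, n_subverters=2):
    """Shuffle a deck containing n_subverters marked 'ace' cards and
    deal one card per attendee. Whoever draws an ace is that meeting's
    secret contrarian; nobody (including the dealer) knows who."""
    deck = ["ace"] * n_subverters + ["blank"] * (len(attendees) - n_subverters)
    random.shuffle(deck)
    return dict(zip(attendees, deck))

# Example deal for a five-person meetup (names are placeholders):
roles = deal_subverter_cards(["Ana", "Ben", "Cal", "Dee", "Eli"])
subverters = [name for name, card in roles.items() if card == "ace"]
```

Because the assignment is uniformly random each meeting, the "crazy uncle Eddy" problem mentioned below is avoided: the dissenting role rotates rather than attaching to one person.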
0juliawise13y
There's someone at the meetup I attend who draws that card every week. Is the purpose of this exercise for the others to guess who drew the contrary card? If so, what is this good for?
6wnoise13y
To make sure it is a different someone. It's very easy for us to mentally categorize someone as "Oh, that's just old crazy uncle Eddy. No reason to actually consider his arguments seriously". And it's also useful for people to gain practice at dissenting.
0zntneo13y
Seems close. I don't see why it couldn't be just one person randomly selected.

If you don't implement Projects, people will improve their Rationality skills at a far slower pace.[4] You will thus run afoul of Bhagwat’s Law of Commitment: “The degree to which people identify with your group is directly proportional to the amount of stuff you tell them to do that works."

This seems to equate "improving Rationality skills" with "identifying with the group". I find this frightening, and a step towards just using "rationality" as something in the name of which to grub power, influence and followers and as a flag to rally a generic community around. Maybe that's the function of religious teachings for religious communities, but I hope not for LW.

1Costanza13y
I would have thought so too, but a lot of people here are obviously loving this stuff.
1Bongo13y
And that's kind of frightening too. I don't think it's too much of an exaggeration to say that this stuff is basically a cult roadmap.

I think we should be a little careful of using the word "cult" as a mental stop sign, since that does seem to be what's happening here. We need to be a bit more careful about labeling something with all the bad connotations of a cult just because it has some of the properties of a cult -- especially if it only seems to have the good properties. But... that doesn't mean that this good cult property won't lead to the bad cult property or properties that we don't want. You should just be more explicit as to what and how, because I'm wavering back and forth on this article being a really, really good idea (the benefits of this plan are obvious!), and a really, really scary and bad idea (if I do it, it'll make me become part of a groupthinky monster!).

The problem I have is that both sides in my own head seem to be influenced by their own clear cognitive biases -- we have the cult attractor on one hand and the accidental negative connotations and stopsigny nature of the word "cult" on the other. So if you could semi-explicitly show why adopting the idea this article puts forth would lead to some specific serious negative consequences, that would clear up my own indecision and confusion.

6Bongo13y
I'm afraid my comments were mostly driven by an inarticulate fear of cults and of association with a group as cultish as Mormons. But one specific thing I already said I'm afraid of is that of LW becoming a "rational" community instead of a rational community, differing from other communities only by the flag it rallies around.
5gscshoyru13y
You know what... I was missing the "look for a third option" bit. There are more options than the two obvious ones -- doing this, and not doing this. I've been having trouble making myself do the rationalisty projects that I came up with for myself, and this article suggested a way to use group pressures to make me do these things. Since I really wanted a way to make myself do these projects, the article seemed like a really, really good idea. But instead of making the whole community do this, I can actually just ask some fellow rationalists at my meetup to do this, just to me. That way I can use group pressures to help impose my own rationally determined second order desires on myself. The only thing I think I lose this way is the motivation via a sense of community, where everyone else is doing it too... Of course, this still doesn't resolve the problem of whether or not the community at large should adopt the ideas put forth in this article. I still can't seem to think rationally about it. But at least this is a way for me to get what I want without having to worry about the negative side effects of the whole community adopting this policy.
1jimrandomh13y
If you took a typical community and replaced its flag with one that said "be rational", would you expect the effect to be positive, negative, or neutral?

I don't really know, but I'll note that Scientologists are known to laud "sanity", and Objectivists were all about "reason".

You might think that a belief system which praised "reason" and "rationality" and "individualism" would have gained some kind of special immunity, somehow...?

Well, it didn't.

It worked around as well as putting a sign saying "Cold" on a refrigerator that wasn't plugged in.

Rationality flags don't seem to help that much.

2Desrtopa13y
No, I wouldn't.
5fubarobfusco13y
Conjecture: Sufficiently dedicated groups that do not take measures against "bad cult properties" will fall down the cult attractor. So if you want a group to not fall down the attractor, you have to think about bad cult properties and how to avoid them. Various folks have come up with lists of just what "bad cult properties" are; one of my favorites is Isaac Bonewits' "ABCDEF". Bonewits' motivation appears to have been to help people be more comfortable involving themselves in unusual groups (he was a neopagan leader) by spelling out what sorts of group behavior were actually worth being worried about. I won't repeat Bonewits' list here. I think it's worth noting, though, that several of the properties he outlines could be described as anti-epistemology in practice.
8AdeleneDawner13y
Having read the list, LW-as-it-was-last-week-and-presumably-is-now seems to be unsurprisingly good at not being a cult. It does occur to me that we might want to take a close look at how the incentives offered by the group to its members will change if we switch to a more recruitment-oriented mode, though.
1wedrifid13y
See also.
0mutterc13y
Yeah, I figured I wasn't going to be too worried about LW's cultishness unless/until rules for sexual behavior got handed down, from which Eliezer was exempt.
0[anonymous]13y
Could you deconstruct this for me? I kind of understand where you're coming from, but I'd like to see your exact reasoning.
0EStokes13y
All else equal, I think a person who identifies as someone who tries to be rational would make more rational decisions than one who does not. (Downvoted :S?)
8[anonymous]13y

(I'm not sure where to ask this, so I'll just put it here.)

Do you have any experience with doing this kind of thing online-only? I currently don't have any rationalist community around and I'm not even sure if I want one, but the benefits seem tremendous and I'm interested in at least trying.

1handoflixue13y
I'll second the interest in some form of online community / meet-up, and feel a bit silly for not thinking of the idea previously :)
1[anonymous]13y
(Not specifically a reply to you, but it feels weird replying to myself. Just some brainstorming.) Well, there's a Virtual Meetup, but that's video-based. I find that very intimidating, so I'd be interested in something less real-time. I liked the idea of regular checkups (related talk), so maybe something mail-based like that might work. Incidentally, I recently bought a Zeo and it comes with some basic coaching. They regularly send you mail, helping you come up with some meaningful goals, asking you about your progress and so on. I really enjoyed that and it helped me track my sleep consistently so far. I'll think about that more and maybe start something myself.
1handoflixue13y
Thinking out loud: If the issue is simply video, one could pretty easily run an IRC meetup. It has the advantage of still being somewhat "real time" and thus something you could block out an hour or two for as a weekly "meetup." If you want to avoid real-time entirely, then a forum or mailing list ought to suffice. I'd point out that LessWrong is already effectively a forum, so I think you'd get a lot less out of this. All of these, including the video conference, probably also run into scaling issues unless you have a very good organizer and a crowd willing to go along with it. I'd expect small groups of 3-6, splintering as needed, would probably be best? I suppose mostly it depends on what you're looking to get out of it, and why the main LessWrong site doesn't suffice.
0erratio13y
So far most of them have been voice-based. I'll also add that I'm the kind of person who's typically intimidated by the idea of discussing things in realtime (I feel like I don't generate points or counterpoints fast enough most of the time), but I've found the virtual meetups to be worthwhile and non-intimidating.
0[anonymous]13y
Sounds good. I will re-consider joining the virtual meetup.

Finally, a really 'low-cost' way to make a project and follow up. Right before the conclusion of a Less Wrong group, give everyone a slip of paper and ask them to write down one thing they are going to do differently next week as a result of the discussion. For two minutes (total) at the beginning of the next meeting, let people tell what they did.

This is a really good idea. I've enjoyed your series of posts and I think you have a lot of really good ideas.

8AngryParsley13y
That tactic combines commitment and consistency with social proof. After 5 people have told the group what honorable and high-status things they're going to do, you'd have a hard time saying, "Well I didn't learn anything useful tonight, but it was fun to catch up with some of you guys." even if it were true.
4fubarobfusco13y
This suggests that it would inspire making things up to sound good, regardless of whether they are true. I don't think that's a hugely great result.
3AngryParsley13y
That's the point I was trying to make. I'm sorry if it came across as endorsing the tactic. "Commitment and consistency" and "social proof" are two of the six "weapons of influence" from Cialdini's Influence: The Psychology of Persuasion.
1mutterc13y
Maybe. Being a lone dissenter is likely high status amongst LWers.
2handoflixue13y
Hmmmm, that's a good point. I like the idea of hanging out casually and growing at my own pace. I like the idea of learning skills that help me accelerate that pace. I definitely dislike any sort of social pressure to match the group's pacing, though.

Reflect communally at your next LW meeting

Share examples communally at your next LW meeting.

For two minutes (total) at the beginning of the next meeting, let people tell what they did.

At first this reminded me of one of the more obnoxious LDS commitment techniques: encouraging everyone to "bear [sic] their testimony". In short, whenever there is a gathering of Mormons, they're periodically (in the context of a monthly meeting) pressured to ALL make some public commitment to belief in the church or value of the community. This pressure is expli... (read more)

7Alex Flint13y
You seem to be arguing that using a community gathering to motivate people is inherently bad because in the past it has been used by wicked people for wicked deeds. This is a very dangerous stance to take, as it can potentially prohibit any technique that has ever been used for evil deeds. Some of the early Scientology meetings were held on boats - does that mean we should never ever allow ourselves to hold a LW meetup on a boat?
1Jonathan_Graehl13y
Maybe it seems that way, but I'm not. I'm saying that if the expectation is close enough toward "everybody affirms the effectiveness of the technique they were told to practice last week" as opposed to "a few people who are most excited can share what worked/didn't about it", you'll distort people's thinking.

This is an excellent plan. Excellent writing, organization, thought. This is a rally-point for implementation.

It makes me uneasy when I see competent missionaries. I don't know if I have the energy to compete against them.

The idea of brevity, giving weekly assignments, and discussing them at the next meeting makes me think of "Agile software development" practices in general. The goals of rational self-improvement and agile software development seem to align fairly neatly, too.

It has the added advantage that it scales very well: You can use these techniques to manage a group, or just yourself. The ability to "go it solo" if needed seems important to this crowd.

I'm going to set a goal of having a post on this by May 22nd, to try and motivate myself to think about it more and see if I can't come up with some applied thoughts :)

I really appreciate your advice about doing things. I like doing things almost as much as I like not doing things. Doing is important and we as a community should do more things. But......... ideas! It turns out that one of the weird things about this universe is that ideas might actually be more important than actions. That the very fate of the light-cone -- whether or not orthodox Mormons get their 1+ planets (and whether or not it is "simulated" or "real")-- depends on what may turn out to be slight differences in the things humanity chooses to do, which might boil down to surprisingly small insights from individuals.

Ideas.

4MartinB13y
Ideas + Doing. Each on its own is not particularly useful.
8Costanza13y
That's not exactly how I'd put it. If the history of the world for the past hundred years or so teaches anything, it's that energetic, enthusiastic, and active people can be dedicated and loyal to very, very bad ideas. This has had rather unpleasant results on more than one occasion. Getting the ideas right is important.
2MartinB13y
And then there are bright people with correct ideas who never bother to do anything with or about them, or even write them up. It is not enough to be right and stop there.
4Costanza13y
Agreed! With that said, I submit that the ideas of the Mormon church are not correct. They are not remotely right. Better they should stop before proceeding to the "doing" phase.
0MartinB13y
That is not completely correct. There is no absolute wrong in what the Mormons do. There is also no way to first become absolutely right, and then start acting. There is a continuum of wrongness. Sometimes you have to act before being correct, as in some cases where you act against an evil. With a closer look you might find things the Mormons do that are better than the actions of common society, even if they do them for mistaken reasons. Not using drugs comes to mind; some religious groups take that seriously. And of course the idea of being awesome to your kids and family, in case that actually applies to a higher degree to Mormons. If they actually are to 'STOP' what they do at the moment, HOW will that take place? There are many ways to get the break away from a religion wrong.
4Desrtopa13y
I would argue otherwise. This may be the case morally speaking, but if you're applying standards by which this is true of everyone, then the claim is fairly vacuous. If you're speaking evidentially, then I would argue that yes, they're processing data in a way that is absolutely wrong. And of course, there are plenty of wholesome, happy Mormon families. But I've known enough bitter ex Mormons with horror stories that I must treat the idea that Mormonism improves people's family lives in general with extreme skepticism. If a "closer look" tells us that some norms lead to happier or more productive lives, and some have negative repercussions, isn't that closer look better taken before establishing the norms?
0MartinB13y
The argument also works for Christian families and other religious groups. I am wary of labeling big parts of the population as inherently evil. While I would like religion to just disappear, there has to be some thinking on what it will be replaced by. It can easily be made worse. The devil you know, and such.
0JamesAndrix13y
There is a definition of terms confusion here between "inherently evil" and "processing data absolutely wrong". I also get the impression that much of Europe is an extremely secular society that does OK. There is confusion for individuals transitioning and perhaps specific questions that need to be dealt with by societies that are transitioning. But in general there is already a good tested answer for what religion can be replaced by. Getting that information to the people who may transition is trickier.
0[anonymous]13y
what

Good advice; I'm actually looking to start some similar projects. As you said, feedback is very important, but for some of us it's difficult to find rationalists in our area with whom to share these experiments. I would like to see some sort of online group where we can share and discuss practical ideas, or get advice from time to time. A forum would probably be enough, and I can create one if there's enough interest.