Dale McGowan writes:

In the past seven years or so, I’ve seen quite a few humanistic organizations from the inside — freethought groups, Ethical Societies, Congregations for Humanistic Judaism, UUs, etc. Met a lot of wonderful people working hard to make their groups succeed. All of the groups have different strengths, and all are struggling with One Big Problem: creating a genuine sense of community.

I’ve written before about community and the difficulty freethought groups generally have creating it. Some get closer than others, but it always seems to fall a bit short of the sense of community that churches so often create. And I don’t think it has a thing to do with God.

The question I hear more and more from freethought groups is, “How can we bring people in the door and keep them coming back?” The answer is to make our groups more humanistic — something churches, ironically, often do better than we do.

Now I’ve met an organization founded on freethought principles that seems to get humanistic community precisely right. It’s the Brooklyn Society for Ethical Culture [...], host of my seminar and talk last weekend, and the single most effective humanistic community I have ever seen.

So what do they have going for them? My top ten list:

Read on at Meming of Life


If you're wondering how good a link has to be to make it to the front page, this is a good example.

If there are known best practices and you fail to follow them - not from having a better way, but from simple negligence - then you can't really complain when you fail, can you?

Do you mean (1) a good example for someone wondering where the threshold is (hence, a link-post that's only just good enough for promotion), or (2) a good example for someone wondering how a mere link-post could possibly be good enough, or (3) both at once because only by bumping against the theoretical limit of maximal goodness could a link-post merit promotion?

One thing might be worth mentioning. For most religions, helping others is a shared core value. This means that everyone joining a church can expect (even on a rational level) to receive a warm welcome. As the communities themselves also feel they ought to be warm to newcomers, that is what usually happens too.

The problem rationalists face is that the community is essentially dog-eat-dog: everyone tries their best to scrutinize others' thoughts (as this is the "rational" thing to do) and then to bash the hell out of them for every small detail missed, because "this is critical for becoming more rational", although more often than not it can probably be ascribed to that person wanting to show that he is smarter and a "better rationalist". And people fear this criticism, even when it might not materialize. They know that in order to be accepted, they need to be really rational. This sets rather high standards of self-confidence for joining.

What religious groups have over us is that for them the competition to be a better Christian/Muslim/Jew doesn't interfere with community formation but rather helps it. Rationalists have it the other way around, and frankly I see no way to cure this while still remaining a "rationalist group". If anyone can solve it, or at least offer possible solutions, this might help the cause greatly.

In view of #1, maybe LW should have a "welcome new members" thread?

Right now LW would be absolutely impenetrable to a newcomer. Perhaps we would be able to maintain a wiki without getting into arguments if we just used it as a glossary?

I agree with ciphergoth that LW is impenetrable to a newcomer. Even if they could penetrate, the whole downvoting thing is about as unwelcoming as you can get.

I can only conclude that LW is not meant to increase the number of rationalists, it's meant to get ideas on how to increase the number of rationalists.

I'm not sure I'd even go that far yet, but the site is young; with the right software we could eventually create a resource that was a little less impenetrable. If we end up having a sister wiki (an idea I argued against but am now coming around to) it would be good if it were easy to wikilink in comments, so you could say something like "one-box" in a comment and have it link to what we are referring to.

Great article. However, there’s one more item that needs to be added to the list to really differentiate us from religious institutions, and that item needs to become the new #1: Offer Transformation that actually works. That is, make being alive a significantly more meaningful experience for people who participate. Do this as well as items #1 thru #10 and we’ll be more popular than any existing religion. The kind of transformation religion offers used to work OK, but not so much anymore because the religious world-view hasn't been updated to reflect current knowledge. People sense that religion doesn’t speak very well to the world as we now know it through science, but they don’t know where else to go.

Most UU (Unitarian-Universalist) congregations already practice items #2 thru #10, and many are working hard on getting better at #1. Maybe that explains why UUs have 1,094 groups, way more than any other class of skeptics. But, so far, UUs don't have a good plan for Transformation either (I've been a UU Humanist for 25 years).

Offer Transformation that actually works. That is, make being alive a significantly more meaningful experience for people who participate.

Do we really need to simulate drugs to be taken seriously?

logi said, "Do we really need to simulate drugs to be taken seriously?"

Yes, so long as the competition offers something like it. I accepted Jesus Christ as my personal Lord and Savior at age 12, and it was a lasting emotional high that only recently have I been able to reproduce (at age 50). At age 12, my behavior and life really did improve considerably (as did the lives of those around me) for a long period of time, until I finally backslid. I became rational at age 20 (and was quite ashamed of the personal Lord and Savior thing), and wasn't able to reproduce the positive effects I experienced at age 12 until only very recently, at age 50. Fortunately, I was able to do so without giving up rationality. In fact, rationality is key to a lasting Transformation.

Interesting. I rejected belief in God and the afterlife also at age 12, because I realized that the only reason I believed those things were true was that it made me feel good to believe them. It was a lasting intellectual high knowing that I had overcome my base emotions with some clear thought.

Emotional manipulation is like propaganda. One of things I found most striking about OB was the high signal-to-noise ratio. No base emotional pleas, no Giant Implicit Moral Framework silently guiding the discourse. I'd hate to see this community lose that by stooping to emotionally drugging its members in an attempt to "replace church".

Can't we just tell people the truth? You want transformation? Take psilocybin. You want to feel like things are more meaningful? Smoke marijuana. You want willpower and IQ? Take amphetamines. These all carry risks, just as religious emotional manipulation does. At least they don't soak up precious time and words.

I guess it comes down to the epistemic/instrumental divide. I'm here for the truth, not for a bag of naive attempts to stimulate dopamine release.

logi's comments are in quotes.

"Emotional manipulation is like propaganda".

I'm not talking about emotional manipulation. I'm talking about a healthy emotional life, one that provides authentic happiness. Emotions are wonderful so long as they're guided by reason.

"Can't we just tell people the truth?" Yes, absolutely!

Apparently logi has given up on the idea that transformation, meaning, and willpower are achievable without doing long-term damage to oneself. I totally disagree; it is possible if we just open up our thinking a little bit.

"I'm here for the truth" So am I!

I'm talking about a healthy emotional life, one that provides authentic happiness.

I don't think I know what you mean by "healthy emotional life" or "authentic happiness" here.

I'm not talking about emotional manipulation.

But earlier, you said to the question of simulating drugs:

Yes, so long as the competition offers something like it. I accepted Jesus Christ as my personal Lord and Savior at age 12, and it was a lasting emotional high that only recently have I been able to reproduce (at age 50).

Drugs are a direct form of emotional and cognitive manipulation, so what I mean by "simulating drugs" is to achieve something similar.

Emotions are wonderful so long as they're guided by reason.

I never claimed otherwise. I actually even value some emotions that aren't "guided by reason". But I certainly try not to let any of them in turn guide my reason.

Apparently logi has given up on the idea the transformation, meaning, and willpower are achievable without doing long-term damage to oneself.

No, I think you miss my points. One, I'm saying this stuff is totally orthogonal to what I find valuable in a rational community. This in the end must be a personal objection, but I am probably not alone in it.

Two, I'm as skeptical of the content of these "transformations" and "meanings" as I am of their drug-induced counterparts, regardless of their long-term harm or other drawbacks.

We probably share a concept of "willpower", and it's probably true that there are generally effective and sustainable techniques superior to drugs.

I totally disagree, it is possible if we just open up our thinking a little bit.

I don't know what this means.

Even if all you want is to improve your own accuracy of belief, the more that people around you want the same thing, the better you can correct for your mistakes (cf. Aumann). To that end, it is helpful to find ways to make rationalism more approachable and welcoming to people who are unlikely to experience such an intellectual high, a group that the evidence suggests encompasses most people.

the more that people around you want the same thing

Do they really want the same thing? It's hard to tell when we're also offering other tasty treats. Prostitutes will do things for cocaine that they wouldn't do otherwise, and pretend to enjoy themselves.

As for Aumann, Robin Hanson points out that we'd often do better to update toward "average beliefs" than the beliefs of our chosen in-group. So it appears I can already maximize my "Aumann benefit" by conversing with random strangers. It seems to me that the benefits of a rationalist group are precisely the opposite: Discover good arguments and important data that we were previously unaware of, that we find convincing regardless of source. If we value our own mere opinions too highly, we've already lost.

Do they really want the same thing? It's hard to tell when we're also offering other tasty treats. Prostitutes will do things for cocaine that they wouldn't do otherwise, and pretend to enjoy themselves.

Maybe they would, maybe they wouldn't. But if rationalism doesn't at least offer something comparable to other options, many people won't even try.

As for Aumann, Robin Hanson points out that we'd often do better to update toward "average beliefs" than the beliefs of our chosen in-group. So it appears I can already maximize my "Aumann benefit" by conversing with random strangers. It seems to me that the benefits of a rationalist group are precisely the opposite: Discover good arguments and important data that we were previously unaware of, that we find convincing regardless of source. If we value our own mere opinions too highly, we've already lost.

Robin's argument in that link seems to be that taking pleasure in disagreement with average beliefs is, all else equal, a bad thing; it's certainly not an argument in favor of updating toward average beliefs. Aumann agreement only strictly applies to ideal rationalists with shared assumptions, but as a rule of thumb one should update toward other agents' beliefs based on the demonstrated rationality of their belief-forming process.

"Maybe they would, maybe they wouldn't. But if rationalism doesn't at least offer something comparable to other options, many people won't even try."

So why should we want to attract such people?

We know why cult groups usually try to attract as many people as possible: they're just raw material to them, explicitly or implicitly.

How is it to our benefit to adopt an r-strategy rather than a K-strategy?

Among other reasons, because mass opinion often influences decisions, e.g. politics, in ways that impact everyone, including us. The greater the average rationality of the masses, the better those decisions are likely to be.

Rational arguments, being restricted to sanity, are un-optimized for swaying masses for political gain.

It's not a good idea to fight irrationality's strengths with rationality's weaknesses.

But if rationalism doesn't at least offer something comparable to other options, many people won't even try.

True. I think the goal here is a bit more complex than "maximize number of self-proclaimed rationalists", though.

Robin's argument in that link seems to be that taking pleasure in disagreement with average beliefs is, all else equal, a bad thing; it's certainly not an argument in favor of updating toward average beliefs.

I was presenting it as an argument favoring updates toward average beliefs over doing so for in-group beliefs, but you're still right that it's really making an unrelated point.

Aumann agreement only strictly applies to ideal rationalists with shared assumptions, but as a rule of thumb one should update toward other agents' beliefs based on the demonstrated rationality of their belief-forming process.

I find such demonstrations quite difficult to identify. Doing so requires both confidence in the correctness of their conclusion and, to a lesser extent, confidence that the beliefs you observe aren't being selected for by other rationalists.

I have no interest in joining a church, period. It doesn't matter to me whether that church spouts theistic nonsense or humanistic nonsense. I'm interested in what groups teach and what they practice, not in their rituals or atmosphere.

Certainly a rationalist group could avail itself of techniques that make people feel good about the group. But people who join the group for the sake of those feelings, or who wouldn't join unless their feelings were carefully massaged, aren't rationalists. Bringing those people into the fold can only distract us from what's important and dilute the message. Syncretism requires sacrificing the essential nature of at least one of the two incompatible things being combined.

So your reasoning is that this group has some similarity to churches, and churches spout nonsense, so this group spouts nonsense? Or is it that churches spout nonsense, so you don't want to join a church, so you don't want to join a group that looks like a church, regardless of whether it spouts nonsense?

"So your reasoning is that this group has some similarity to churches, and churches spout nonsense, so this group spouts nonsense?"

Not at all. They identify themselves as humanist.

And if you're interested in what groups do rather than how they do it, you're in a vast minority. Good for you - you don't have to join a church, even a rationalist one! Nobody's making you!

But people have emotions. It's not 'rational' to ignore this. As Eliezer says, and clarifies in the next post, rationalism [is/is correlated with/causes] winning. If the religious get to have a nice community and we have to do without, then we lose.

Yes, I would like to join a community of people very much like a church, but without all the religious nonsense. I'm pretty sure I'm not alone in this.

Bringing those people into the fold can only distract us from what's important and dilute the message.

This asserts a fact not in evidence, to wit, that people can't learn and change.

If it's not the organization that causes them to learn (the right things) and change (to become more rational), we have no grounds for expecting that those things will happen.