Comment author: ChristianKl 03 April 2014 08:57:23PM 0 points

Yup - although in the case of EA, that's still likely to be a very slow process.

I don't think that's a problem. We don't need to think in terms of corporate quarter results but are free to think more long-term.

Focusing on quick fixes isn't what successful movement building is about.

Comment author: mbitton24 04 April 2014 02:44:54AM -3 points

To clarify, I also don't think EA has much potential as a social movement even if marketed properly. Specific EA beliefs are much more spreadable memes down the line IMO.

Comment author: ChristianKl 02 April 2014 11:04:35PM 0 points

The best way to recruit is to have people who are passionate enough about a subject that they tell their friends.

Comment author: mbitton24 02 April 2014 11:28:35PM -2 points

Yup - although in the case of EA, that's still likely to be a very slow process. This isn't the sort of thing that can go viral. It takes months or years of cultivation before someone goes from complete outsider to core member.

Comment author: peter_hurford 02 April 2014 08:14:57PM -1 points

So you expect movement building / outreach to be a lot less successful than community building ("inreach", if you will)?

Yes, especially if the same strategies are expected to accomplish both. They're two very different tasks.

I wouldn't say the same tasks will work equally well for both. But I do think either would have spillover effects for the other. Right now, it seems we're focused on community building, though.

~

I think you can convince people to give more of their money away, you can convince people to take the effectiveness of the charity into account, you can convince people to care more about animals or to stop eating meat, and possibly that there are technological risks that are greater than climate change and nuclear war. I don't think you'll convince the same person of all of these things. Rather they'll be individuals that are on board with specific parts and that may or may not identify with EA.

I'd be interested in how much overlap there is between these groups. It was never my intention to try to convince people of the entire meme set at once, but I wouldn't rule it out as implausible. I think better understanding these channels (how people come to these beliefs) is most important.

Comment author: mbitton24 02 April 2014 10:07:54PM -2 points

If you're talking about recruiting new EAs, it sounds like you mean people who agree enough with the entire meme set that they identify as EAs. Have there been any polls on what percentage of self-identifying EAs hold which beliefs? It seems like the type of low-hanging fruit .impact could pick. Such a poll would give you an idea of how common it is for EAs to believe only small portions of the meme set. I expect that people agree with the majority of the meme set before identifying as EA. I believe a lot more of it than most people do, and I only borderline identify as EA.

Comment author: peter_hurford 02 April 2014 03:12:08AM 2 points

In order to recruit new EAs, your pitch will almost definitely have to downplay certain areas that many core EAs spend lots of time thinking about.

I think "core" EAs understand and are comfortable with that, so they won't feel alienated.

-

As long as your goal is to increase that number [of core EAs], you're going to see very low recruitment rates.

Some of this comes down to what counts as an "EA". What kind of conversion do we need to do, and how much? I also think I'll be pretty unsuccessful at getting new core EAs, but what can I get? How hard is it? These are things I'd like to know, and things I believe would be valuable to know.

-

If you want more "total altruistic effort," go convince people to show more altruistic effort.

So you expect movement building / outreach to be a lot less successful than community building ("inreach", if you will)?

Comment author: mbitton24 02 April 2014 02:26:52PM -2 points

So you expect movement building / outreach to be a lot less successful than community building ("inreach", if you will)?

Yes, especially if the same strategies are expected to accomplish both. They're two very different tasks.

Some of this comes down to what counts as an "EA". What kind of conversion do we need to do, and how much? I also think I'll be pretty unsuccessful at getting new core EAs, but what can I get? How hard is it? These are things I'd like to know, and things I believe would be valuable to know.

I think you can convince people to give more of their money away, you can convince people to take the effectiveness of the charity into account, you can convince people to care more about animals or to stop eating meat, and possibly that there are technological risks that are greater than climate change and nuclear war. I don't think you'll convince the same person of all of these things. Rather they'll be individuals that are on board with specific parts and that may or may not identify with EA.

Comment author: ChristianKl 02 April 2014 01:36:02PM 0 points

There's a reason why for-profit organizations do this - it actually works.

It costs them huge advertising budgets and gets less effective as time goes by.

For-profit organizations actually do this because they don't have a cause to rally around. Making more money for shareholders doesn't give anyone a feeling of community.

Steve Jobs got rid of focus groups telling him what people want and built products to fit his own standards. As a result, Apple has managed to develop a strong brand.

And if you won't agree with a worldview, you aren't going to join the community just because it's active.

But you might switch to self-identifying as EA because there are people on Skillshare doing nice things for you without asking for anything in return.

That self identification will then improve the chances that you are doing other things to advance EA. It helps with retention.

Comment author: mbitton24 02 April 2014 02:15:45PM -2 points

I'm saying it helps with retention but barely at all with recruitment - and that it may even get in the way of recruitment of casual EAs. I don't think Skillshare favours will make people want to self-identify as EA. Only a minority of people even require the sorts of favours being offered.

Comment author: mbitton24 01 April 2014 10:03:42PM -1 points

"A stronger community for the effective altruist movement should better encourage existing EAs to contribute more and better attract new people to consider becoming EA. By building the EA Community, we hope to indirectly improve recruitment and retention in the effective altruist movement, which in turn indirectly results in more total altruistic effort, in turn resulting in more reduced suffering and increased happiness."

I'm going to predict that .impact struggles to meet this objective.

I think you're taking a naive view of how movement building works.

I think you need to see the distinction between retaining and recruiting members as analogous to the tension between a core and casual fan base. In order to recruit new EAs, your pitch will almost definitely have to downplay certain areas that many core EAs spend lots of time thinking about. That way, you'll bring in a lot of new people that, for example, buy the argument that you should donate to the charity that provides the most bang for your buck and yet still, for example, have zero interest in AI or animals. If you refuse to alienate core EA members' values in order to get more casual EAs (e.g. people that donate to GiveWell's top charities and give a bit more than average) then, well, that's admirable, I guess, but your movement building won't go anywhere. There's a reason why for-profit organizations do this - it actually works.

The amount of people that share most EA values is going to remain low for a very long time. Increasing that number wouldn't involve "recruitment" as much as it would involve full-on conversion. As long as your goal is to increase that number, you're going to see very low recruitment rates. Most people aren't on the market shopping for new worldviews - but individual new beliefs or values, maybe. And if you won't agree with a worldview, you aren't going to join the community just because it's active.

If you want more "total altruistic effort," go convince people to show more altruistic effort. Trying to movement build a group as complex and alienating as EA by strengthening its internal ties will dissuade most outsiders from wanting to join you. Pre-existing communities can be scary things to self-identify with.

You know how some parents make their kids try cigarettes at a young age so that they'll hate it and then not want to smoke when they're older? Well, a website like Brian Tomasik's is like that for most potential EAs. Way too much, too soon.

Comment author: mbitton24 17 March 2014 04:24:57PM 0 points

Cool. Will there be a lot of overlap with Intuition Pumps and Other Tools for Thinking? Based on your description, it sounds like Dennett just wrote this book for you.

Comment author: jaibot 25 February 2014 10:40:16AM 1 point

Not just magical thinkers. I heard Massimo Pigliucci making the same "this isn't definitive and therefore it tells us nothing" argument on the most recent Rationally Speaking podcast.

Comment author: mbitton24 28 February 2014 05:11:21AM -2 points

You're right. I think scientific thinkers can sometimes misinterpret skepticism as meaning that nothing short of peer-reviewed, well-executed experiments can be considered evidence. I think sometimes anecdotal evidence is worth taking seriously. It isn't the best kind of evidence, but it falls above 0 on the continuum.

Comment author: mbitton24 27 February 2014 05:09:43AM -1 points

The good news is that our higher cognitive abilities also allow us to overcome depression in many situations. In Stumbling on Happiness, Daniel Gilbert explains how useful it is that we can rationalize away bad events in our lives (such as rejection). This capability, which Gilbert refers to as our psychological immune system, explains why people are able to bounce back from negative events much more quickly than they expect to.

Comment author: JQuinton 24 February 2014 06:54:34PM 8 points

They basically stopped short of calling the scientific method a cultural construct

I had this problem recently too, and my solution was not to mention "science" in and of itself, but to mention heuristics based on probability. It's much harder to argue that math is a social construct. If you can use probability theory to explain where biased reasoning goes wrong, it might go over a lot better.

Comment author: mbitton24 24 February 2014 07:47:15PM 15 points

I think speaking in terms of probabilities also clears up a lot of epistemological confusion. "Magical" thinkers tend to believe that a lack of absolute certainty is more or less equivalent to total uncertainty (I know I did). At the same time, they'll understand that a 50% chance is not a 99% chance even though neither of them is 100% certain. It might also be helpful to point out all the things they are intuitively very certain of (that the sun will rise, that the floor will not cave in, that the carrot they put in their mouth will taste like carrots always do) but don't have absolute certainty of. I think it's important to make clear that you agree with them that we don't have absolute certainty of anything and instead shift the focus toward whether absolute certainty is really necessary in order to make decisions or claim that we "know" things.
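A toy sketch of the point about 50% versus 99% (my own illustrative example with made-up numbers, not anything from the thread): neither degree of belief is absolute certainty, yet they still license different decisions under expected value.

```python
def expected_value(p_success: float, payoff: float, cost: float) -> float:
    """Expected value of an action that costs `cost` up front and
    pays `payoff` with probability `p_success`."""
    return p_success * payoff - cost

# Paying $10 for a chance at a $12 payout:
print(expected_value(0.99, 12, 10))  # positive: worth taking
print(expected_value(0.50, 12, 10))  # negative: not worth taking
```

Even though both beliefs fall short of 100%, only one of them makes the bet rational, which is the sense in which a 50% chance is not a 99% chance.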
