I probably won't be able to put much effort into the new forum, but I'll express a few hopes for it anyway.
I hope the "Well-kept gardens die by pacifism" warning will be taken very seriously at the EA forum.
I hope the forum's moderators will take care to squash unproductive and divisive conversations about race, gender, social justice, etc., which seem to have been invading and hindering nearby communities like the atheism/secularism world and the rationality world.
I hope the forum's participants end up discussing a well-balanced set of topics in EA, so that e.g. we don't end up with 10% of conversations being about AGI risk mitigation while 1% of conversations are about policy interventions.
Also, I think this "Well-kept gardens die by pacifism" post might be a good illustration of a problem I have with how the sequences are regarded in general. The epistemological quality of this post seems pretty poor: although it's discussing phenomena that are best studied empirically (as opposed to phenomena best studied theoretically, like math), it cites no studies, and it doesn't attempt to become a proto-study itself by, say, finding a method to do quasi-random sampling of online communities and figuring out whether each community constitutes evidence for or against its thesis. Instead, its argument is based mainly on personal experience (concrete examples in the form of actual specific anecdotes, like 4chan, are few).
This phenomenon could also have been productively studied theoretically: say, by making reference to the expected quality of any given post, thinking about how frequently users are likely to return to a given forum and how important it is for them to see new/valuable content each time they return, etc. But EY makes no attempt to do that either. (At least MBlume starts to think about modeling things in a more mathematical fashion.)
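To make the suggestion concrete, here is a minimal sketch of the kind of toy model the comment gestures at. Everything in it is hypothetical (the uniform quality distribution, the assumption that a visitor's return probability tracks the best post they saw, and all names and parameters are my own illustration, not anything from the original post):

```python
import random

def expected_return_rate(quality_floor, n_posts=10, trials=10_000, seed=0):
    """Estimate the chance a visitor comes back, modeled as the expected
    maximum quality among the posts they see on a visit.

    Post quality is assumed uniform on [quality_floor, 1]; moderation is
    modeled simply as raising quality_floor by pruning the worst posts.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        # Quality of the best post the visitor sees this visit.
        best = max(rng.uniform(quality_floor, 1.0) for _ in range(n_posts))
        # Treat best-seen quality directly as the return probability.
        total += best
    return total / trials

# Compare no moderation (floor 0) with light moderation (floor 0.3):
unmoderated = expected_return_rate(0.0)
moderated = expected_return_rate(0.3)
```

Under these (very strong) assumptions, raising the quality floor raises the expected return rate, but with diminishing returns as the number of posts per visit grows, which is roughly the shape of argument the post could have made explicit.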
And yet it's a featured post voted up 91 points... as far as I can tell, largely on the strength of the author's charisma. I'm glad this post was written. I found it valuable to read; it has a couple of novel arguments and insights. But it seems suboptimal when people cite it as if it were the last word on the question it addresses. And it seems weird that, largely on the strength of that post's advice, posts as epistemologically weak as it are no longer written as often on LW. Tossing around novel perspectives can be really valuable even if they aren't yet supported by strong theoretical or empirical evidence.
I hope the forum's moderators will take care to squash unproductive and divisive conversations about race, gender, social justice, etc., which seem to have been invading and hindering nearby communities like the atheism/secularism world and the rationality world.
To play devil's advocate: Will MacAskill reported that this post of his criticizing the popular ice bucket challenge got lots of attention for the EA movement. Scott Alexander reports that his posts on social justice bring lots of hits to his blog. So it seems plausible to me that a well-reasoned, balanced post that made an important and novel point on a controversial topic could be valuable for attracting attention. Remember that this new EA forum will not have been seeded with content and a community quite the way LW was. Also, there are lots of successful group blogs (Huffington Post, Bleacher Report, Seeking Alpha, Daily Kos, etc.) that seem to have a philosophy of having members post all they want and then filtering the good stuff out of that.
I think the "Well-kept gardens die by pacifism" advice is cargo culted from a Usenet world where there weren't ways to filter by quality aside from the binary censor/don't censor. The important thing is to make it easy for users to find the good stuff, and suppressing the bad stuff is only one (rather blunt) way of accomplishing this. Ultimately the best way to help users find quality stuff depends on your forum software. It might be interesting to try to do a study of successful and unsuccessful subreddits to see what successful intellectual subreddits do that unsuccessful ones don't, given that the LW userbase and forum software are a bit similar to those of reddit.
(It's possible that strategies that work for HuffPo et al. will not transfer well at all to a blog focused more on serious intellectual discussion. So it might be useful to decide whether the new EA forum is more about promoting EA itself or promoting serious intellectual discussion of EA topics.)
(Another caveat: I've talked to people who've ditched LW because they get seriously annoyed and it ruins their day when they see a comment that they regard as insufficiently rational. I'm not like this and I'm not sure how many people are, but these people seem likely to be worth keeping around and catering to the interests of.)
I like your comment, but this struck me as a bit odd:
(Another caveat: I've talked to people who've ditched LW because they get seriously annoyed and it ruins their day when they see a comment that they regard as insufficiently rational. I'm not like this and I'm not sure how many people are, but these people seem likely to be worth keeping around and catering to the interests of.)
Having one's day ruined because of one irrational comment is quite bizarre (and irrational). I don't think that people with such extreme reactions should be catered to.
I think the "Well-kept gardens die by pacifism" advice is cargo culted from a Usenet world where there weren't ways to filter by quality aside from the binary censor/don't censor.
Ah... you just resolved a bit of confusion I didn't know I had. Eliezer often seems quite wise about "how to manage a community" stuff, but also strikes me as a bit too ban-happy at times. I had thought it was just overcompensation in response to a genuine problem, but it makes a lot more sense as coming from a context where more sophisticated ways of promoting good content aren't available.
Awesome job, thanks a lot for putting in the effort to make this happen. My biggest concern is that without multiple people willing and able to pump out high-quality content on a regular basis, the forum won't achieve critical mass. Have you reached out to all the existing EA bloggers to see whether they're willing to cross-post on an ongoing basis, or at the very least whether they object to someone else doing the grunt work and cross-posting for them (this could maybe even be automated)? This creates the additional problem of fragmented comment threads, but that is a smaller problem.
Thanks!
I agree that it's important to attract contributors. For now, I'm looking to attract bloggers' goodwill while curating content for the first month. Once I've attracted enough seed content, I should be able to start recruiting i) regular contributors and ii) people to use it to promote their local meetups. If people like the site, the offer will only become more attractive as the site grows. :)
So all else equal, I would expect that a forum devoted to effective altruism will tolerate uninformed posts better than a forum devoted to rationality would, because EAs want to spread effective altruism and one way of doing that is to educate people about it. I've previously hypothesized that LW has suffered from overly heavy-handed moderation that's stifled people from posting content... it seems like trying to have a culture of encouraging contributions could be useful here (even going so far as to disable the downvote button for the first few months after launch?)
Another idea is to target specific effective altruists who aren't already terrifically busy and ask them to write blog posts; I suspect some would be honored to be asked and would take you up on the offer.
Personally, I think a big way to help ensure success is to not worry too much about drawing self-identifying "effective altruists", but primarily focus on simply drawing active altruists. Obviously, if it only recruits from LW and its handful of bosom-buddies, there'll hardly be a large-enough population. The mere title of the forum should do 70% of the work to keep everything on topic, and friendly reminders from mods should handle the rest.
I think that's important enough that I'm going to stress it again: if the EA community is 80% LW-ians, then I think it will lose most of its potential. LW already discusses effective altruism. Drawing people such as full-time workers in existing charities (public health, economic support, missionary work, etc.) seems a far higher priority to me. LW-like people already think along those lines, and generally have less energy/money dedicated to altruism than full-time charity workers. Attracting the latter group would have a greater impact on the individuals, and target more important individuals to boot.
In that vein, I've already scoped out the blog, because I have several friends in mind who would take to this well. Currently, the front article has math. Lots of math. Here on LW that's almost the norm, but if the whole EA site is like that, it'll scare off a lot of good people. That's not to say "don't use math because people don't like it" - math is very important - but rather "decide what audience to target on your front page."
I'm excited! Thanks for putting it together.
If this will be a place for EAs to connect intellectually and share thoughts and resources, it should definitely be connected to both .impact and the Gittip platform for EAs donating to other EA's.
I would suggest immediately contacting Patrick about .impact and me or Ozzie about Gittip and the EA Funding Network.
I would also suggest spending 5 minutes thinking of other resource sharing platforms and locations you may want to connect to it.
The Effective Altruism Forum will be launched at effective-altruism.com on September 10, British time.
Now seems like a good time to discuss why we might need an Effective Altruism Forum, and how it might compare to LessWrong.
About the Effective Altruism Forum
The motivation for the Effective Altruism Forum is to improve the quality of effective altruist discussion and coordination. A big part of this is to give many of the useful features of LessWrong to effective altruists, including:
The Effective Altruism Forum has been designed by Mihai Badic. Over the last month, it has been developed by Trike Apps, who have built the new site using the LessWrong codebase. I'm glad to report that it is now basically ready, looks nice, and is easy to use.
I expect that at the new forum, as on the effective altruist Facebook and Reddit pages, people will want to discuss which intellectual procedures to use to pick effective actions. I also expect some proposals of effective altruist projects, and offers of resources. So users of the new forum will share LessWrong's interest in instrumental and epistemic rationality. On the other hand, I expect that few of its users will want to discuss the technical aspects of artificial intelligence, anthropics or decision theory, and to the extent that they do so, they will want to do it at LessWrong. As a result, I expect the new forum to cause:
At least initially, the new forum won't have a wiki or a Main/Discussion split and won't have any institutional affiliations.
Next Steps:
It's really important to make sure that the Effective Altruism Forum is established with a beneficial culture. If people want to help that process by writing some seed materials, to be posted around the time of the site's launch, then they can contact me at ry [dot] duff [at] gmail.com. Alternatively, they can wait a short while until they automatically receive posting privileges.
It's also important that the Effective Altruism Forum helps the shared goals of rationalists and effective altruists, and has net positive effects on LessWrong in particular. Any suggestions for improving the odds of success for the effective altruism forum are most welcome.