TL;DR: I had an idea to apply some tools I learned on Coursera to our community, in order to help it grow better. I want to start some organized thinking about the goals our community has, and offer some material for people who are eager to work on it but are maybe lost or need ideas.
Yesterday I did a course on coursera.org called "Grow to Greatness: Smart Growth for Private Businesses, Part I". (I often play lectures at 2.5x speed, so I can do a five-week course in one day.)
Though the course seems obvious, I'd say it's worth the three hours, so look it up. (It's hard to say how much of that impression is hindsight and how much of the material is actually too easy and basic.) It got some of my ideas sorted and showed me the tools. I'm not an expert now, obviously, but at least I can see when things are done in an unprofessional manner, and it may help you understand what follows.
When growing anything (a company, a community, ...) you have different options. You should not opt for all of them, because you will be spread too thin. You should grow in measured steps, so that people can follow and so that you can do it right. That is the essence of the course; the rest is focused on specific ways of growing.
That was the informative part of this article. The rest is some thoughts that came to my mind and that I would like to share. Hopefully I can inspire some of you and start some organized thinking about this community.
This community is some kind of organization, and it has a goal. To be precise, it probably has two goals, as I see it:
- to make existing members more rational
- to get more members.
Note that the second goal is the one about growing.
I will just plainly write down some claims this course made:
In order to grow:
- your people need to grow (as people: to gain more skills, to learn).
- you need to create more processes regarding customers, in order to preserve good service
- you often need better organization (to regulate processes inside the company)
- you need to focus
- you need a plan
- if you need to put out fires, put out the fire with the greatest impact first, and turn the fix into a process, so that people can handle it on their own afterwards
1. I guess no-one is against this. After all, we are all here to grow.
2. My guess is that our customers could be defined as new members. So, the first steps someone makes here are the responsibility of this organization. Later, when they get more into rationality and start working on themselves, they become employees. That's at least how it works in my head. The book of the Sequences is a good step here, since it helps to have it all organized in one PDF.
3. This is actually where it all started for me. We are just a bunch of people with a common drive to be more rational. There are meetups, but that's it. I guess some people see EY as some kind of leader, but even if he were one, that's not an organization. My first idea is to create some kind of separation of topics, reddit-like. (With or without moderators; we can change that at any point if one option does not work.)
For example, I'm fed up with AI topics. When I see AI, I literally stop reading. I don't even think it's rational to push that idea so much. I understand the core of this community is in that business, but:
- One of the first lessons in finance is "don't put all your eggs in one basket". If there is something more important than AI, we are fucked if no one sees it. I guess "non-rational" people will see it (since they are not active on this forum and therefore not focused on AI), but then the people of this forum lose the attribute "rational", since the "non-rationals" outperformed them simply by doing random stuff.
- It may stop people from visiting the forum. They may disagree, they may feel "it's not right", but be unable to formulate it as "don't put all your eggs in one basket" (my example, kind of). The remaining choice is to stop visiting the site.
So, I would STRONGLY encourage new topics, and I would like to see some kind of classification. If I want to find out about AI, I want to know where to look, and if I don't want to read about it, I want to know how to avoid it. If I want to read about self-improvement, I want to know where to find it. Who knows, after some rough classification people might start making finer ones, and discuss how to improve memory without being spammed with procrastination posts. I think this would help with the first goal (making existing members more rational), since it would give them some overview.
I also think this would reduce cultishness, since it would add diversity and loosen the "meta".
4. An understatement. Anyone who has worked, or read anything about work, knows how important a plan is. It is OBLIGATORY. Essential. (See the course https://www.coursera.org/learn/work-smarter-not-harder/outline )
5. I think this one is not very important for us. There are lots of people here, many of them enthusiasts. However, it should serve as a guideline for making a good plan and for deciding how many resources to devote to each problem.
In conclusion, I understand these things are big. But growth means change. (There is an EY quote on this, I think: not every change is an improvement, but every improvement is a change; correct me if I'm wrong.) Humans did not get this far by being individually better, but by socializing and cooperating. So I think we should move from a herd to an organization.
I'd also like to see targeted interaction with, and outreach to, the academic research community.
GiveWell has a good model of validating and checking intuitions against prominent people in development, but seems to opt for public intellectuals over less famous experts in the field, whose thinking those public intellectuals may defer to. In the EA community, I feel this has led to such confidence in deworming, when deworming is actually one of, if not the, most controversial topics in academic impact evaluation (nicknamed the "worm wars"). And DALYs are a pariah outside of the specific subcommunities of impact analysis that look to the future rather than immediate use.
There may be many similar misunderstandings in the rationality community which are taken for granted. But unlike the EA community, the rationalist community seems to be less transparent. MIRI's technical research agenda is still secret, amongst other things.
By contrast, I can go on GiveWell, which in some ways isn't part of the EA community so much as the inspiration for it, and see how they think, and their motivating influences, cleanly laid out, without even going into their methodology. Be warned, ordinary readers, I'm playing the critic here. MIRI is much more technically complicated than GiveWell; I'm just trying to give criticism in order to be constructive. The path dependence and novelty of MIRI's agenda, amongst other things, are obvious barriers to doing things the EA way in the rationalist community.
Btw, I think you've misspelled 'community'. Some members of the community seem really neurotic about that sort of thing, and it would be a shame if you were downvoted or missed upvotes for something as trivial as that.
Thanks for the warning. I forgot to check the title. Grammar Nazis always lurk for that fresh non-native-speaker flesh.