Various people raised concerns that growth might ruin the culture after reading my "LessWrong could grow a lot" thread. There has been some discussion about whether endless September, a phenomenon that kills online discussion groups, is a significant threat to LessWrong and what can be done about it. I care enough about this that I volunteered to code a solution myself, for free if needed. Luke invited debate on the subject (the debate is here) and will be sent the results of this poll and asked to make a decision. He suggested in an email that I wait a little while before posting my poll (meta threads are apparently annoying to some, so we let people cool off). Here it is, preceded by a Cliff's Notes summary of the concerns.
Why this is worth your consideration:
- Yvain and I checked the IQ figures in the survey against other data this time, and the good news is that it is now more believable that the average LessWronger is gifted. The bad news is that LessWrong's average IQ has decreased on each successive survey. One can argue that the decrease is small or that we don't have enough data, but if the data is good, LessWrong's average has lost 52% of its giftedness since March of 2009.
- Eliezer documented the arrival of poseurs (people who superficially copy cultural behaviors and are reported to overrun subcultures), whom he termed "Undiscriminating Skeptics".
- Efforts to grow LessWrong could trigger an overwhelming deluge of newbies.
- LessWrong registrations have been increasing rapidly, and it's possible that growth could outstrip acculturation capacity. (Chart here)
- The Singularity Summit appears to cause a deluge of new users that may have an effect similar to the September deluges of college freshmen that endless September is named after. (This chart shows a spike correlated with the 2011 Summit, when 921 users joined in one month, which is roughly equal to the total number of active users LW tends to have in a month, going by the surveys or Vladimir's wget.)
- A Slashdot effect could result in a tsunami of new users if a publication with lots of readers like the Wall Street Journal (they used LessWrong data in this article) decides to write an article on LessWrong.
- The sequences contain a lot of the culture and are long, meaning that "TL;DR" reactions may make LessWrong vulnerable to cultural disintegration. (New users may not know how detailed LW culture is, or that the sequences contain so much of it. I didn't.)
- Eliezer said in August that the site was "seriously going to hell" due to trolls.
- A lot of people raised concerns.
Two Theories on How Online Cultures Die:
Overwhelming user influx.
There are too many new users to be acculturated by older members, so they form their own, larger new culture and dominate the group.
Trending toward the mean.
A group forms because people who are very different want a place to be different together. The group attracts more people who are closer to the mainstream than people who are equally different, simply because mainstream people are more numerous than different people. The larger group then attracts people who are even less different in the original group's way, for the same reason. The original group is slowly overwhelmed by people who will never understand it because they are too mainstream.
Poll Link:
Request for Feedback:
In addition to constructive criticism, I'd also like the following:
- Your observations of a decline or increase in quality, culture, or enjoyment at LessWrong, if any.
- Ideas to protect the culture.
- Ideas for tracking cultural erosion.
- Ways to test the ideas to protect the culture.
Agreed. That's easier. However, sometimes the easier way is not the correct way.
In a world where the authoritative "facts" can be wrong more often than they're right, where scientists often take a roughly superstitious approach to science, and where the educational system isn't even optimized for the purpose of educating, what reason do I have to believe that any authority figure, expert, or established user is more likely to be correct?
I wish I could trust others' information. I have wished that my entire life. It is frequently exhausting and damned hard to question this much of what people say. But I want to be correct, not merely pleasant, and that's life.
Eliezer intended for us to question authority. I'd have done it anyway, because I started doing that ages ago. But he said in no uncertain terms that this is what he wants:
In Two More Things to Unlearn from School he warns his readers that "It may be dangerous to present people with a giant mass of authoritative knowledge, especially if it is actually true. It may damage their skepticism."
In Cached Thoughts he tells you to question what HE says. "Now that you've read this blog post, the next time you hear someone unhesitatingly repeating a meme you think is silly or false, you'll think, "Cached thoughts." My belief is now there in your mind, waiting to complete the pattern. But is it true? Don't let your mind complete the pattern! Think!"
Perhaps there is a way to be more pleasant while still questioning everything. If you can think of something, I will consider it.
You might think about the reasons people have for saying the things they say. Why do people make false statements? The most common reasons probably fall under intentional deception ("lying"), indifference toward telling the truth ("bullshitting"), having been deceived by another, motivated cognition, confabulation, or simple mistake. As you've noticed, scientists and educators can face situations where complete integrity and honesty come into conflict with their own career objectives, but there's no...