I would like to explore your belief, Louie, that increasing the rate of growth of LW is desirable.
ADDED. Clarification: interventions to increase the rate at which new people join LW are good IMHO when the current rate is low enough not to overwhelm the "mechanisms for newbie assimilation". So I guess I want to ask Louie if he is measuring the rate of new arrivals and whether he plans to discontinue the adwords if that rate gets too high.
ADDED. Last time there was a meetup at Benton house, I told Kevin that I wish he had waited to apply "search-engine optimization" techniques (to increase participation on LW) until LW had had more time to assimilate the influx from Methods of Rationality. I hope I am not misrepresenting Kevin, but I got the distinct impression from his reply in that short conversation that he recognized no good reason for LW not to grow as fast as possible. (Of course, Louie is not Kevin.)
I certainly do not object to the goal of building more rationalists (as you put it), but I worry that too fast a rate of growth might drastically decrease the usefulness of LW as a place to learn. And I worry that it would cost a lot more to reverse that decline than it would to prevent it.
Do you perceive that LW's usefulness as a place to teach and to learn has decreased since its founding?
Do you believe that as LW grows it will become a less useful resource for learning for people who are already very strong rationalists? I would not mind if that proves to be the case BTW if the growth helps larger numbers of less-strong rationalists: I just want to know what the objectives are of those making significant efforts to improve LW.
I do not think that anyone will dispute that most online conversations (forums, group blogs, boards, etc.) lose their usefulness after a few years in a way that is difficult enough to reverse that one might as well just start over with new software, new rules or new owners. Paul Graham maintains that the conversation he runs (Hacker News) is constantly on the verge of failing and that only constant interventions by Graham and his helpers prevent that from happening. In particular, Graham considers growing too fast to be a potent risk to Hacker News.
To what do you attribute the uncommonly high usefulness of LW up to now relative to other places to learn online?
Which practices (deliberate or accidental) do you believe have maintained or will maintain the usefulness of LW as a place to learn and to teach?
What signs (if any) would cause you to come to believe that LW's capacity to assimilate new voters and new contributors has been pushed to its limits and that attempts to grow LW should be suspended until the capacity has recovered?
What fraction of LW comments do you personally "vote on" (make a decision as to whether to upvote, downvote or leave alone)? ADDED. The reason I ask is that voting on many comments was much harder work than I thought it would be before I tried it. (The hard work came from the need to avoid or recover from an overly judgmental and critical mood that would have had a deleterious effect on my non-LW life.) Consequently, unless my experience with voting was atypical, anyone contemplating an action that would add significantly to the work done by LW's voters should probably have significant experience voting on a significant fraction of LW's comments unless he or she has some other way to avoid overextending that particular resource.
ADDED. The tone of this comment is probably too strident. It is important to keep in mind that since it is relatively inexpensive to start a new group blog, forum or other "place to teach and learn" on the internet, the "loss" of LW is not all that expensive. Nevertheless, I am curious as to why I seem to be deep in the minority in certain perceptions:
The way LW has been since its founding is that anyone who registers can start voting, and I am aware of no plan or proposal to change that.
Perception One that I have that does not seem to be shared: it is easy to design a system better than what I just described. E.g., require new registrants to accumulate 20 karma before they can vote.
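Perception One could be made concrete with a few lines of code. This is purely a hypothetical sketch of the rule described above (the names `User`, `can_vote` and the threshold constant are illustrative, not part of LW's actual codebase):

```python
# Hypothetical sketch of "Perception One": gate voting behind a karma
# threshold so that new registrants cannot vote immediately.
KARMA_THRESHOLD = 20  # the example threshold suggested above

class User:
    def __init__(self, karma=0):
        self.karma = karma

def can_vote(user):
    """A registrant may vote only after accumulating enough karma."""
    return user.karma >= KARMA_THRESHOLD

# A fresh account cannot vote; an established one can.
assert not can_vote(User(karma=0))
assert can_vote(User(karma=20))
```

The point is only that the check itself is trivial; the design question is whether the threshold is worth the friction.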
Perception Two: there are some simple numbers that could be collected by software in a straightforward way that are not being collected such that if one of the numbers took a sufficiently steep downturn, that would be a strong sign that any efforts (like sending readers of Methods of Rationality here, asking people to promote LW on StumbleUpon or buying adwords for LW) to increase the rate of growth of LW should be curtailed for the time being. One of those numbers is the rate of voting by accounts older than 2 years (those accounts being more likely to be able to vote in such a way as to keep up the standards of LW) as a fraction of total votes (or as a fraction of the votes made by those accounts during the same time frame last year). Another of those numbers is the number of comments made or number of words posted by accounts older than 2 years. In other words, it seems like a basic precaution to me to make sure that the new voters and the new writers on LW are not driving away the old ones.
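One of the numbers in Perception Two is straightforward to compute. The sketch below assumes each vote record carries the voting account's creation date; the data shape and function name are my assumptions, since LW would pull this from its own database:

```python
from datetime import datetime, timedelta

# Hypothetical sketch of "Perception Two": the share of total votes cast
# by accounts older than two years. A steep downturn in this fraction
# would be the warning sign described above.
def old_account_vote_fraction(vote_account_created_dates, now, min_age_days=730):
    """Return the fraction of votes cast by accounts older than min_age_days."""
    if not vote_account_created_dates:
        return 0.0
    cutoff = now - timedelta(days=min_age_days)
    old = sum(1 for created in vote_account_created_dates if created <= cutoff)
    return old / len(vote_account_created_dates)

# Illustrative data: three votes, two from accounts created before the cutoff.
now = datetime(2011, 1, 1)
votes = [datetime(2008, 6, 1), datetime(2010, 12, 1), datetime(2007, 3, 1)]
print(old_account_vote_fraction(votes, now))  # prints 0.6666666666666666
```

The same shape of computation works for the comment-count and word-count numbers mentioned above.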
Again, I am not opposed to trading off average comment quality for increased educational reach if that is the direction those who do the work to maintain LW want to take, but there's a difference between consciously making that trade-off and losing LW while neglecting to use the loss to collect any data useful for improving the robustness of future online conversations.
Perception Three: Since a person who has figured out how to use one web site will have very little trouble figuring out how to use another one, the higher the quality of a conversation on the web which anyone can join, the more difficult it will be to prevent that quality from regressing to the mean quality of conversations on the web that anyone can join -- and a conversation has to be far above the mean quality for it to start to compete with a competently-chosen college or textbooks as a way for young aspiring rationalists to learn.
Perception Four: Even though LW is vastly better as a place to teach and to learn and for people to cooperate on projects to improve the world than the average place on the internet, it is possible to imagine places on the internet that are much better than LW. Effort put into understanding how to preserve or maintain the quality of the conversation on LW can probably be applied to making it even better. I.e., the optimistic outcome of work on "online conversation quality" has very high utility. I.e., this is not just about avoiding the need to find another place to teach and to learn because LW has become useless for that purpose.
Specifically, there are probably tricks we can learn to improve online group rationality without needing to improve the rationality of the group's members. By "improving group rationality" I am imagining mostly (1) making teaching and learning online less irritating, less addictive and more time-efficient and (2) making world-improving projects more effective without increasing the individual effectiveness or rationality of the teachers, learners or project members. What people like Paul Graham, Clay Shirky and Teresa Nielsen Hayden know about maintaining online conversations is quite relevant here IMHO.
Thanks for thinking about these things. Very useful comments and questions.
Having a karma requirement for moderating comments seems like a really, really good idea. I'm gonna sleep on that and see if I can think of any downsides. At the very least, we should probably have a requirement that you must have positive karma (which I don't think is currently required?). I wonder how much moderation is done by people with less than a certain level of karma? It's possible a significant amount of moderation is being done by people with zero or negative karma wh...
So I'm trying to build more rationalists. To do this, I've invested a few hundred dollars of my own money to promote Less Wrong by buying low-cost AdWords on Google for different LW pages. I want to reach smart people with a really good article from Less Wrong that answers their question and draws them into our community so that the site's content can help improve their rationality. Based on buying AdWords before, I'd estimate that only 0.5-1% of people who click through to Less Wrong will actually get involved after reading an article, but since clicks only cost ~$0.04, that means it only costs me ~$6 to build a new rationalist and drastically improve someone's life. Seems like an excellent return on investment.
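The cost estimate above is just cost-per-click divided by conversion rate; a quick sketch using the comment's own numbers (these are the estimates stated above, not measured data) shows where the "~$6" figure sits:

```python
# Cost-per-new-rationalist arithmetic from the comment above:
# at ~$0.04 per click and a 0.5-1% conversion rate,
# cost per conversion = cost_per_click / conversion_rate.
cost_per_click = 0.04

for rate in (0.005, 0.01):
    cost = cost_per_click / rate
    print(f"{rate:.1%} conversion -> ${cost:.2f} per new rationalist")
# prints:
# 0.5% conversion -> $8.00 per new rationalist
# 1.0% conversion -> $4.00 per new rationalist
```

So the range is $4-$8, and the "~$6" figure is roughly the midpoint of that range.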
But to get a strong 1% conversion rate and really make an impact, I need to identify REALLY EXCELLENT Less Wrong content. Right now I'm experimenting by buying a lot of keywords related to quantum mechanics and sending people to http://lesswrong.com/lw/r8/and_the_winner_is_manyworlds/
My hope is that this page is useful and memorable enough that some small % of readers stick around and click through to other pages. My guess is that this isn't the ideal page to do this with but it's aiming in the right direction.
What page would you want a new Less Wrong reader to find first? What answers a specific question they might have in such an impressive way that they would want to learn more about our community (perhaps many different pages for many different questions)? Which articles are most memorable? Just looking at "Top" didn't yield any obvious choices... I felt like most of those articles were too META-META-META ... you'd need too much background knowledge for many of them. An ideal article would be more or less "stand-alone" so that any relatively intelligent person who doesn't have the whole LW corpus in their head already could just jump in and understand it immediately... and then branch out and explore LW from there.
So what do you think? Give me links to any landing pages you think would be worth promoting this way. You can write rough mini-ads as suggestions too if you'd like to be even more helpful. I'm looking forward to hearing your suggestions!