[Note: This is very rough, but I’m looking for feedback and help on this estimate, so I wanted to post it quickly and see if others can help.]


I’ve been trying to estimate theoretical upper and lower bounds on the *potential* community size of LW.


I know rationalists who seriously claim that we shouldn’t spend effort trying to grow LW because over half of the people smart enough to even follow the conversation (much less add to it) are already here.  The world is pretty irrational, but I’m trying to find evidence (if it exists) that things aren’t that bad.  There are only around 6,000 LW accounts, and only 200-300 are active in any given month.

So a trivial bound on our community size is

[200, 6.88 billion]

A big filter is being able to use English online

Native English Speakers: 341,000,000
All English Speakers:  508,000,000
I found a similar number (536,000,000) for the number of internet users who speak English.

7.4% ---  speak English (508 million out of 6.88 billion)

However, only 15% of the US + UK (the majority of English speakers worldwide) are “Not religious”.  Another sad sub-fact is that 25% of people with “No religion” believe in God as well!  So really it’s only 10-11% of Americans and Britons (15% × 75% ≈ 11%) who are potential LWers.  My guess is that if someone can’t get this right, they really need to go through Dawkins before they can realistically contribute to or benefit from our community.  I’m sure there’s some debate on this, but it seems like a pretty good heuristic while not being iron-clad.

0.81% ---  speak English and are non-religious (7.4% × 11%)

"Intelligence and the Wealth and Poverty of Nations" says that the US and the UK have avg IQs of 98 and 100 respectively.

And although you’d think being an atheist would be a big screen that filters strongly for IQ, it only adds about 4 IQ points over the average.

So if we assume a baseline IQ of 103 among atheists from the US and UK (who speak English), the proportion of them with an IQ of 130 or above is only 3.6% (z = (130 - 103)/15 = 1.8, and about 3.6% of a normal distribution lies more than 1.8 standard deviations above the mean).


0.0293% ---  speak English, non-religious, IQ 130+ (0.81% × 3.6%)


So if we clumsily extrapolate the US+UK demographics across all English-speaking online people worldwide, maybe we have 2 million possible readers left in our target audience?
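For concreteness, here’s the whole funnel as a quick back-of-the-envelope script (Python; the independence-of-filters assumption and the N(103, 15) model are just the ones from above, so treat this as a sketch rather than a real model):

    from scipy.stats import norm

    world_pop  = 6.88e9                      # world population
    english    = 0.074                       # fraction who speak English (508M / 6.88B)
    secular    = 0.15 * 0.75                 # ~11%: "not religious" and don't believe in God
    iq_130plus = norm.sf((130 - 103) / 15)   # ~3.6%: tail of N(103, 15) above 130

    audience = world_pop * english * secular * iq_130plus
    print(f"Potential audience: {audience:,.0f}")   # roughly 2 million

Multiplying the filters like this assumes they’re independent, which is exactly the kind of thing I’d like help checking (e.g., English ability and IQ are probably correlated).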


Google Analytics claims Less Wrong has had a total of 1,090,339 “Absolute Unique Visitors” since LW started.  That’s almost certainly an over-estimate -- though at least it’s not the plain “unique visitors” number, which is over 2 million.  Hmm... if we assume that’s correct and 1 million people have come to LW at some point but only 6,000 stuck around long enough to sign up, perhaps we did already have about half our target audience arrive and leave?  I dunno.  What do you think?


I think this analysis of mine is pretty weak, especially the numerical estimates and my methodology.  I’m trying to use conditional probabilities but having trouble keeping the factors separate.


I’d welcome help from anyone who can find better statistics or do a totally different analysis to estimate our potential audience size.  Is it 10 million? 10,000? Somewhere in between?

Some other screening characteristics I’ve considered using are MBTI, gender (is it correct to just divide our target by 2? I haven’t chosen to do that, but I think a fair case could be made for it... let me know what you think), age (although I believe my English screen already removes those too young to use the net), etc.

I’m looking forward to seeing some other sets of numbers that come to different estimates!


Tangent: If there is a desire to recruit more people to LW I'd like to see a concerted effort to target domain experts we don't have. You start to get diminishing returns adding more computer programmers with amateur interests in hard science.

I'm doubtful LW will ever become really good at recruiting.

I went to the page with the Recent Posts and crunched some numbers. Of 250 recent posts, 35 were on meetups, 25 were recurring threads like Quotes or Potter Commentary, 28 were things that require you believe in our particular formulation of transhuman singularitarianism before the premise even makes sense, and 13 were meta about the site itself. That leaves a hundred fifty that could realistically draw new people in. Of those, 55 had <15 karma; i.e., we couldn't even find fifteen people on this site who were willing to say they were any good, so newbies are unlikely to like them much. That leaves 95. Of those, about 25 were about specialized math-y topics that most IQ 130+ rationalists don't even realize exist and have no background in, like decision theory. So that leaves maybe 70 posts since May that are really in the running for attracting newbies. Many of these posts are in my opinion either so soft and vague as to be boring, or else filled with jargon.

Honestly, I'm surprised we're getting new people at the rate we are.

I haven't looked as closely at this as you, but from what I can tell the site looks much better if we only examine promoted posts. Meta and AI posts rarely make the front page. What does fill up the front page are the meetup announcements. I'm not sure what to do about that, since promoting that aspect of the site does seem important. It wouldn't look so bad if we were getting more front-page-worthy rationality/science posts.

Does anyone other than me have a bunch of post ideas they've been meaning to write but keep procrastinating on? If there are a couple of us, I have an idea.

28 were things that require you believe in our particular formulation of transhuman singularitarianism before the premise even makes sense,

I'll just say it: for reasons totally unrelated to recruitment, I'd like to see considerably fewer of these. People are pretty good about voting down the FAI ones (which are nearly always horrendously sloppy). But my sense is that, in general, posts on transhumanist/singularity topics are noticeably less rigorous and less insightful than posts on other topics. Why this is so is a rather interesting question. My guess is the lack of a foil. With topics in science, philosophy and rationality there are established authorities with empirical work to bolster our position and professional studies for us to criticize and engage with. When I did high school debate, my best matches were with debaters much more talented and experienced than I was, because I would rise to their level. Conversely, I would sink to the level of my least talented opponents. The problem with transhumanist/singularity topics is that a) people here already agree with the most talented contributors in the related fields and b) many of the other contributors in those fields are quite mediocre relative to domain experts in more mainstream fields.

I believe LW is at its best when it is engaging and criticizing mainstream institutions and popular ways of doing things and trying to provide alternatives. This is particularly the case when the target is a representative of traditional rationality. Now, of course there is a Less Wrong critique of the mainstream regarding singularity and transhumanist topics; it's just a rather simple and straightforward one which has been covered dozens of times. It is simple because most people, institutions and systems of thought have made zero effort to engage with transhumanism and Singularitarianism. There is no foil.

I realize, of course, that I have nowhere near the standing to dictate the way Less Wrong should be. But occasional months where AI was forbidden as a topic, as it was in the very beginning, would, I think, rejuvenate our discussions.

And yes, I realize I'm the guy who posted about Pet Cryonics in the discussion section.

Thanks for this analysis, Yvain. I'm glad you're interested in this even if you are (rightfully) pessimistic.

I agree that the ongoing community dynamic is a terrible place to "jump into" LW. I've been wanting someone (any volunteers out there??) to help design a friendlier, more Wikipedia-ish homepage for LessWrong which could actually help non-hardcore LW users navigate the site in a way that makes sense. The promoted stream is a horrible resource for that. If someone could make a mock-up of a potential LW homepage (on the LW wiki, perhaps?), I could get it implemented for you if it's even halfway respectable. Nothing could be worse than what we currently have. A good homepage would probably have a few stable pointers to the important sequences, a portion of the current promoted feed, a video or other tutorial explaining what LW is all about... I dunno. Just anything that isn't an opaque listing of "Open threads!", "Cryonics!", "Torture scenarios!", "Rationality Quotes!", "Omega!", "Meetups!", "Cake!"

For what it's worth, the thing that got me visiting here regularly was the list of EY's OB posts.

I know y'all love the sequences as such, and I can understand why, but the fact remains that I was motivated to work my way through much of that material relatively systematically in a chronological format, and once I got to the end of that list -- that is, the time of the OB-to-LW migration -- I bogged down (1). The sequence/wiki style is less compelling to me than the chronological style, especially given the degree to which the posts themselves really are blog posts and not articles.

I suspect that having some sense of where the process terminates is a key aspect of that. If I don't know how long the process is going to be, it's harder to get up the energy to work through it.

Anyway, to the extent that appealing to people like me is a valuable subgoal(2), it seems what you should do is assemble a simple chronological list of LW posts from the dawn of the site that are most representative of what you'd like the site content to look like. (3)

(1) To be fair, that was also the start of the Fun Theory sequence, which I am finding less compelling than some of its predecessors, so the content may bear some of the responsibility for my bogged-down-ness... but not all of it, nor even (I think) most of it.

(2) Which isn't intended as false modesty; I just mean there's no particular reason to believe that what appeals to me will appeal to anyone else.

(3) It may be sufficient to take the highest-voted tier of posts for each month, say, and string them together chronologically... though you'd probably want to backfill other posts that they depend on.

This is one of the few places on the net where cryonics related topics are discussed in a coherent manner, particularly by people in my age bracket. And I like that while it's cryonics friendly, it's not about cryonics. It's about the (supposed) rational benefit of cryonics. Not just any cryonics related topic gets discussed, and not just in any way.

I can see how the community might grow to a point where the different topics cannot all be discussed in the same place. The current division between discussion and main is already helpful for distinguishing formal versus informal discussions. Perhaps sub-groups for cryonics, decision theory, existential risk, etc. are good ideas. But separating them out runs the risk of creating trivial inconveniences that impede the cross-pollination of ideas.

Yes. Who is promoting these things, and why? What is their rationale?

An updated promotion policy that keeps meetup threads, discussion threads and obscure singularity-heavy (and other nominally off-topic) threads from being promoted could improve the situation.

A quick glance at the front page shows that 9 of the articles currently sitting on it are for meetups, 7 of which are over. I really think meetups, at the very least, should be taken off the front page. I'd also favour at least half the horizontal width being used for pointers to places to dive into the sequences, and potentially even a featured article of the week/month.

28 were things that require you believe in our particular formulation of transhuman singularitarianism before the premise even makes sense

This. This is the thing that has to be fixed before LessWrong can claim to primarily be about "refining the art of human rationality."

AI is interesting, but not about rationality. Cryonics is interesting, but not about rationality. Nanotechnology is interesting, but not about rationality.

Ways of thinking are about rationality. Ways of being smart are about rationality. Ways of being stupid are about rationality. Stories of failing spectacularly are about rationality.

The first group may of course be highly relevant to the second. But it's not about it, and requiring readers of a site advertised as being on the topic of "rationality" to buy into the standard transhumanist belief cluster is a failure of signaling rationality, and thus for these purposes a failure of instrumental rationality.

Before posting, don't just think "Is this interesting to the LW readership?" but also "and is it on topic? Is my actual point about rationality?"

Honestly, I'm surprised we're getting new people at the rate we are.

It's an interesting site full of really smart people, and (and this is a real plus point) the comment quality is consistently high because people buy into the moderation system, i.e. "mod up if you want to see more comments like this." That's a BIG WIN. Keeps me reading.

Really. I strongly suggest you just write a main section post requesting people stay on-topic in posts, and that not clearly being on-topic about rationality is a reason to downvote. See what the community thinks.

I felt a bit out of place until I started reading MoR; what was all this cryonics/decision theory stuff?

A couple chapters in I thought, "THAT'S the kind of stuff I'm interested in talking about! Now I feel like I'm in the right place."

I agree that generally most of the newer content we have is less interesting than the older, but maybe that's because the older content already covers a huge amount of LW's subject base pretty well. The good thing is, the old content is all still here and good for attracting new readers, if new readers are what we care about. That said, it could be better promoted and easier to find, navigate and search.

Of course, speaking selfishly I'd really like it if we had new content that was as solid and useful as Eliezer's work, and some of the other 'classic' content. Perhaps we need more LW people from different backgrounds than we currently cover?


I think the logical next step is to translate the sequences into a book or several books of some kind. I vaguely remember EY talking about writing a book on rationality, but I could be mistaken. This could make LW very good at recruiting new members compared to now, if the book is successful and widely read for years to come.

Everyone who's actually read the sequences - and that's a lot of reading, too much for almost any casual reader - should try summarising and condensing them themselves for their own circle of net-friends. That's how to get a meme out there. You'll be a hell of a lot more convincing to people you know if you're saying something than if you point to someone else saying something.


Eliezer is writing the Sequences into a rationality book, but I agree with David Gerard's suggestion that LW readers should try summarizing his ideas in their own words. This could be an important topic for discussion here on LW: making sure that some of the core ideas discussed on this site aren't lost in translation.

Yes, that's being done.

Honestly, whenever I read through Omega-related posts, I feel like we might be trying to re-invent Calvinist predestination theology. These sorts of paradoxes of free will have been hashed out in monasteries and seminaries for almost two millennia, by intelligent people who were as rational as their understanding of the universe allowed. I wonder if even a theology student has something to contribute here.

Hey, you're relatively new. How did you end up here?

Someone linked to the paperclip maximizer wiki page in a post on reddit.

It's an interesting idea, but we might need more emphasis on becoming more rational to improve your life (rather than on FAI) to attract them.

This relates to something else I was thinking about-- at this point, LW is a pretty small social group. If there were 100,000 active members, which I think is theoretically possible, then LW would develop substructures of some sort.

Less Wrong doesn't seem to be on Facebook. Should it be?

Less Wrong on Facebook

It's an interesting idea, but we might need more emphasis on becoming more rational to improve your life (rather than on FAI) to attract them.

Agreed. Or perhaps rationality as a way of improving science?

This relates to something else I was thinking about-- at this point, LW is a pretty small social group. If there were 100,000 active members, which I think is theoretically possible, then LW would develop substructures of some sort.

That's a terrifyingly large number to think about. Do you think 100,000 could be active while maintaining the current median IQ? Or without the signal-to-noise ratio dropping significantly?

The structure of the site would have to change significantly. We'd have to live with more specialization, no one would be able to follow all of that content.

Let's run some casual analysis on intelligence level and LW.

My impression is that my intelligence level is between 1 in 1000 and 1 in 10,000. It isn't hard for me to find compatible people in science fiction fandom. I think I'd resist arguments that it's below 1 in 500 unless there was very good evidence.

I fit in well here, but am not of the top rank, partly due to lack of math and partly due to lack of ambition. It's possible that even if I had more desire to write top level posts, I still wouldn't be as good at it as Alicorn.

I don't bother trying to follow the strategic maneuvering in MoR (I suspect I'm not alone in this), and appreciated the bit recently where Harry lost track. Sorry -- I don't remember the details.

I haven't found a lot of blogs where I'd say that the intellectual level is as high as LW, but I can recommend Making Light. The posters are very high on the verbal skills side and have respect for rational argument.

So, let's start with 500 million minimally potential LW users. One in a thousand of them would be 500,000.

1 in 5 of them showing up seems like a high proportion, but not insanely so.

Maybe LW could top out at 50,000 instead of 100,000 without changing its character.

Demographics point towards increasing numbers of potential LW posters, if only because the proportion of people who use the web recreationally is going up. I'm not sure whether the Flynn effect is relevant.

There are certainly 100,000 people as smart as the current community who might one day read Less Wrong. What seems unlikely is that we can expand without lowering barriers to entry. Lowering these barriers, it seems to me, would make it very difficult to restrict our recruitment to those who are in that 100,000.

It's complicated -- I'd say that for current members to keep promoting it in their social circles will tend to maintain quality. And MoR is helpful, too.

Just expanding for its own sake wouldn't be a good idea.

Figuring out a more efficient way of teaching rationality than the sequences (I hope Eliezer's book will be that) would be a very good thing.


I don't think the Flynn effect is relevant for this analysis.

Speed of development (in places like India today and Eastern Europe in the early 00's), the spread of English as a second language, and changing cultural standards are, I think, much more important reasons for the rise in the number of English-speaking people who use the web recreationally. Improvements in environment don't affect adult IQ much; as a nation develops and English spreads, it stands to reason that a previously hidden reservoir of potential users represents a greater increase in absolute numbers than the Flynn effect could hope to deliver via children growing up with slightly higher IQs, especially since many developing nations may be developing precisely because they are using up their demographic dividend.

The Flynn effect also clearly can't continue forever; there are signs that it's ending or has perhaps already ended in the developed world (I think Denmark has shown a stall in the Flynn effect for more than half a decade; there may be other examples, but I don't recall the studies right now).

A big filter is being able to use English online

I would really like to see versions of LW in other languages. Admittedly, because English is currently the most prestigious language on Earth, intelligent ambitious people tend to learn it. Still, I think lowering barriers -- including linguistic ones -- is usually worth doing if possible.

I actually enjoy learning and using languages other than English, and would be willing to participate in a project to translate the Sequences into (at least) the major European languages.

As SIAI volunteer coordinator, I would love to help you translate the sequences into other languages!

I'm currently supervising the translation of HP:MoR into Chinese and Korean, with some positive feedback from readers so far. http://www.fanfiction.net/u/2558248/WingChen

And on the other end, we could use more translations of "Artificial Intelligence as a Positive and Negative Factor in Global Risk" on http://singinst.org/research/publications -- currently only in English and Portuguese (the translation took the translator 3 weeks to complete). I have my Harry Potter translators taking a break to translate this doc into Chinese and Korean, and also other volunteers working on Russian and Spanish. But any other languages would be amazing.

But yeah, translating HP:MoR is motivated by the hope that it will draw more folks to LessWrong (or at least prepare them to think rationally). And LessWrong is a potential feeder for SIAI, although admittedly it's a valuable resource in its own right as well, which I respect as being different and separate. But that said, translating the core rationality documents would be great. Even people who can read English often tell me that they enjoy being able to "relax" and read something in their native language. Usually they can read things much quicker, and as you point out, trivial inconveniences are a killer! Also, most people at least START OUT reading LW partially because they find it enjoyable or entertaining and only later fully take on wanting to improve and get stronger... so at least having some of the beginning articles of the sequences in other languages would be worlds better. Drop me an email with your plans for helping with this when you get a chance.

I'm very glad to hear that work is being done on translating the core documents.

I could translate something to German, which is my mother tongue. Which documents have the highest priority? And do you think translations are really worthwhile? I mean, everybody should just speak English or shut up, to put it bluntly. (Yes, I don't like my country....) And, well, in the next two months I can't promise anything, but maybe in the summer holidays.

Can you get rid of some of the white space in this post? Maybe make those sources hyperlinks instead of raw URLs?


I second that.

I know rationalists who seriously claim that we shouldn’t spend effort trying to grow LW because over half of the people smart enough to even follow the conversation (much less add to it) are already here.

That would mean LW posters are 200 out of 500 million. That is, about 99.9999th percentile. That is, about IQ 172. Also, everyone in the Mega Society hangs out on LW.

Yeah, I don't think so.
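For reference, here's the back-of-envelope check in Python (the normal model is already dubious this far out in the tail, so take the exact number loosely):

    from scipy.stats import norm

    fraction = 200 / 500e6                            # 4e-7 of English speakers
    z = norm.isf(fraction)                            # ~4.9 SDs above the mean
    print(f"Implied cutoff: IQ {100 + 15 * z:.0f}")   # ~174, close to the ~172 above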


A more plausible argument would be that there are other preconditions, such as values, spare time, peculiarities in processing social signalling, etc., which in conjunction with an appropriately high IQ are met by a group on the same order of magnitude as LW.

Also getting the real number of people with nearly any IQ level is tricky. The global IQ distribution is badly non-Gaussian due to many factors.

Thanks for starting this.

Some corrections I would make on the analytics numbers:

If analytics says 1 million visitors, I would halve that and still feel I'm overestimating. Remember, analytics can only count unique machines, not humans. I personally use a desktop, a laptop and a mobile phone to access LW, and the phone has replaced another one I used in the past for the same purpose. Also, my computer has been through several reformats. And that's not counting the many browsers I've been through, or the occasional other person's computer I've used to browse LW. Overall I'd say I represent about 10-20 of these 'absolute unique visitors'. I may be an outlier, but given that this site attracts mostly technically savvy folk, I shouldn't be that extreme.

Also, your 2 million estimate is based on people with 130+ IQs, but you can't say the same of the analytics visitors. In fact, given the rarity of 130+ IQs, I would be extremely surprised if they represented anything more than 30% of total visitors ever.
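To put rough numbers on both corrections (the 2 devices/browsers per person is a guessed average; as I said, I alone may count 10-20 times, so it could easily be higher):

    absolute_uniques   = 1_090_339   # Google Analytics "Absolute Unique Visitors"
    devices_per_person = 2           # assumed average devices/browsers per human
    smart_share_cap    = 0.30        # generous cap on the share with 130+ IQ

    people    = absolute_uniques / devices_per_person   # ~545,000 humans
    in_target = people * smart_share_cap                # at most ~164,000
    print(f"~{people:,.0f} people, <= ~{in_target:,.0f} in the target audience")

On those numbers, well under a tenth of the ~2 million target audience has visited so far, which argues against the "half of them already came and left" reading.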


To increase the awesomeness of LW we'll probably want to figure out what makes some individual LWers more awesome than others. In particular, what are the differences between the top contributors and the average LWer? Are there methods to systematically bridge these differences that can make better rationalists? I would be curious to hear what the top contributors think about why they've been able to participate more effectively (which, AFAIK, hasn't been collected in one place). We could do this through a discussion post, and perhaps write a post summarizing the results if they're interesting enough.


BTW I think LW would be more awesome with more pirates.

I'm sure we have all downloaded an album at one point or another...


I'm not sure if this kind of debate is precisely what Louie had in mind; my interpretation differs.

Awesomeness of LW has been discussed in many ways. Recall the "let's do something with our rationality to become stronger!" debates? The "are there enough arbitrarily designated minorities" debate? The "what is LW actually about" debate? The "do you come here for fun too?" debate? Discussions about poster expectations. Top posts about pacifism. Etc.

The very fact that you are posting here in a topic in the discussion section is partially a result of the debate about post quality and the role of keeping our garden free of weeds.

I think the idea of getting some numbers on how many people LW, as it is right now, can hope to attract is a good first step in getting quantitative estimates about how to raise the sanity waterline, which isn't the exact same thing as increased awesomeness.

what makes some individual LWers more awesome than others

Writing skill is a big one.


Actually, I think it's the karma system that really incentivizes writing skill. I don't focus as much on the quality of my writing on other sites compared to LW, since I know that the LW community expects comments to be clear and concise.

You need to be more specific than "rationalists" about the target group.

Who should be reading LessWrong?

  • everyone in the world?
  • those who do or might self-identify as "rationalist" if given the opportunity?
  • the typical moderately bright? <-- I'm around here
  • those of searing IQ only, thanks?

In that case, the question is: "what's it for?"

Who is this aimed at? Who does this need to be useful for?

I understand the blog was founded to recruit more rationalists who would then advance the aims of SIAI. What's the program in detail? Is it in any way organised, or do you just sort of hope people will drift in that direction? To what extent is the stated mission ("refining the art of human rationality") not necessarily aligned with that?

I wrote the above off the top of my head. I haven't found a list of questions like this anywhere on the site as yet. If there isn't one, and the above is actually the best available, that would be most disappointing ...

There has been discussion of this previously, but very little. The comments on the mathoverflow post indicate that there isn't much consensus about the target audience. I haven't reread them all very carefully, though; perhaps there is some more tangible result there that I missed.

I've also talked with at least one person about the subject at the meetups and we disagreed and didn't really resolve our disagreement.

So I agree with you that the question of "who should be on Less Wrong" is an open question, and it is a question which really should be addressed.


Many, many things wrong, but since any of my early attempts at arriving at such an estimate would be just as flawed, I applaud you for having the courage to post this, and I think we can take slow steps toward thinking about this more clearly and having better estimates.

So we are basically searching for the crude number of people who would join the LW community as it is right now if they were exposed to it? Is that a good approximation of what you mean by "upper bound of community size"?

We need to first establish how likely people in various IQ ranges are to be able to understand the concepts of LW. This is necessarily guesstimation, but it almost deserves its own debate. We can't just simply say the minimum IQ is 120. I think a more accurate way would be to check the literature on what IQ ranges tell us about the likelihood of academic success and being good at math (in more quantitative terms than just saying "it helps").

I would say that being an atheist is almost a precondition, but not quite. There have been active posters who have identified as theist; I recommend you check out the threads about LW demographics. Considering the huge number of theists in the world, the number of nominally religious LW posters learning to be less wrong may indeed be nontrivial. Also don't forget the strong Buddhist subculture that exists here (in mode of thought if not in outright religiosity); however, we can't just simply bunch Buddhists together with atheists, especially not traditional Buddhists.

Also, I don't think sex is that important. Sure, there are many, many more men on LW than women. But of that population, women are disproportionately active (at least among the top contributors). I think a better approach than directly factoring in sex is figuring out conformity estimates, and also perhaps education in the hard sciences.

We need to first establish how likely people in various IQ ranges are to be able to understand the concepts of LW. This is necessarily guesstimation, but it almost deserves its own debate. We can't just simply say the minimum IQ is 120. I think a more accurate way would be to check the literature on what IQ ranges tell us about the likelihood of academic success and being good at math (in more quantitative terms than just saying "it helps").

My suspicion is that a lot of LessWrongers have weirder brains than raw IQ will capture. This is part of why we tend to be underachieving geniuses. I took the WAIS III when I was 17 and got a 150 verbal IQ and a 120 performance IQ. That averages to 135 -- but that number isn't actually a very helpful predictor of future performance, because the two scores are so different. This kind of thing gets labeled a learning disability; they didn't even bother writing down the total on my test results. I suspect a lot of people here also have weird brains, with strengths and weaknesses not accurately conveyed by raw IQ.

ETA: Which isn't to say I have a better way of estimating a potential user base.


I'm quite sure that the average LW brain is weird and agree on the point. I considered proposing that rates of highly functional non-neurotypicals be added as a group particularly likely to end up here.

However, I hope you see why I think IQ estimates are very relevant, since there are certain concepts necessary here that become very hard to grasp for those with lower IQs.

It seems like previous exposure to relevant material, including but not limited to the math parts of a college education, would be a much more direct benchmark.

But several of our more productive posters/commenters did little to no college math. That might be a weird exception for philosophers, though.

I'm also not sure college math/science is a sufficiently narrowing criterion.

Well, yes. That was an example; the point I intended and may not have been clear about was that specific content knowledge might be a more accurate way to narrow the set than a quantified measure of general intelligence. There are probably tons of extremely smart people who have never been exposed to the subjects which would make them productive LW contributors.

So we are basically searching for the crude number of people who would join the LW community as it is right now if they were exposed to it?

Yes, that's a much better way to put it. Although I clarify my intended goal a bit more below.

I recommend you check out the threads about LW demographics

My analysis has been heavily influenced by data from Yvain's survey... especially where it matches up with intuition. Atheism (or at least agnosticism) seems to be one of the strongest defining traits of this community (shared by 93.3%). We would expect that.

Perhaps the "crude number of people" I'm looking for is the number of people who would enjoy devoting their time to reading the sequences (for purposes other than to troll them) if they were introduced to them in the right way. I'm assuming that to enjoy the sequences, people would need to be smart enough to mostly understand them and at least be predisposed to caring about having correct beliefs... which unfortunately disqualifies almost everyone. They would also need some free time... and they would have to want to spend that free time reading... which probably disqualifies almost everyone else and reduces the target audience to around the current size of LW. :/ (As an aside, part of my working theory of why this community is so akrasia-filled is that the only people who have the time to read and digest something as long as the LW sequences are people with motivation systems so severely crippled that they prevent their owners from filling their time with things that most people without akrasia have like steady jobs, lovers, and rewarding social interactions. Think about it. Most Americans or other English speakers who have tendencies towards basic rationality (read: our entire target audience) and a somewhat functional motivation system in this world win so hard at life that they quickly become wayyyy too busy to allocate any of their precious time to boring, anti-social & low-prestige* tasks like reading online forums.)

Anyway, my personal theories on LW aside, my goal here is to build more rationalists and stronger rationalists. I assume a number of those people who read the sequences would be "involved in the LW community as it is right now" as well but that's sort of a secondary consideration in my mind. As I've pointed out before, 96% of LW participants are lurkers who only read. So my main goal is to expose interested parties to the sequences and it is my expectation that 3-4% of those folks will naturally stick around and become more involved after that.

the number of nominally religious LW posters learning to be less wrong may indeed be nontrivial

I guess strong to moderately strong theists aren't really in my imagined Less Wrong target audience. Despite a few extraordinary counter-examples, LessWrong isn't really equipped or devoted to the matter of personal de-conversion. And I don't think it's going too far to suggest that being an atheist is pretty much a pre-req to being rational. It's the canonical example of rationality for a reason.

Age is a screening factor -- I expect there are very few LWers under 15, and probably not many under 18.

There's probably an age above which people are significantly less likely to be on LW, but I'm not sure what it is.

There are a lot of people who would otherwise be likely to enjoy LW, but who are too busy. And (probably a smaller number) who are too ill.

Yes, these are all excellent screens to look into. Do you know any sources for these statistics?

No, though the age information should be relatively easy to find.

Too busy or too sick would be much harder.

A while ago, there were complaints from people who said their web connections were too slow, and wanted email versions of LW. I don't know how many people are affected by that barrier.

I don't know if the "too busy" thing actually would be that hard to estimate. Perhaps the US Time Use Surveys could illuminate things a bit? I've always (half-jokingly?) thought that a pre-requisite to becoming a serious LessWrong participant is being in a relatively unstructured period of your life (i.e., unemployed, a college student, semi-retired, etc.) or having a completely broken motivation system which keeps you in a perpetually unstructured life against your will (akrasia).

I imagine that most of our potential target audience (those who would enjoy and learn things of value from the sequences) are already enough above the current sanity waterline of greater society to be so ridiculously successful (by narrow, American-style standards of success) that they no longer devote any significant portion of their time to reading recreationally... kind of like how knowing a bit about biases can hurt you and make you even more biased... being a bit rational can skyrocket you to a high level of narrowly defined American-style "success" where you become a constantly-busy, middle-class wage-slave who zaps away all your free time in exchange for a mortgage and a car payment. Nice job buddy. Thanks for increasing the GDP epsilon%.

Then again, most people, no matter how much time they were given, aren't going to read online intellectual non-fiction as a hobby. It just would never occur to most people as a possible idea.

Long-term online community management is a hard problem. This discussion seems to mostly assume that LW would stay at pretty much the ideal state as long as it had an influx of smart people. I see at least two problems here. One is that LW doesn't have a very clear outside context to specify what's on topic and what isn't, such as a forum focusing on, say, military history, knitting or recent publications in theoretical mathematics would have. LW is about a very open-ended problem, and approaches to it are mostly specified by what goes on in the comments. This makes LW vulnerable to community drift, where low-quality discussion starts setting the tone of the site, and drives out quality posters. New posters don't just see "a discussion site about rationality", they see a site with a specific culture and specific approaches, and if these look crap, the non-crap posters will look elsewhere.

The other thing is scaling issues. The site dynamics will change as it grows, and a site that's usable at one scale of users is not necessarily usable with another scale. As an example, it's currently feasible to follow conversations through the "Recent Comments" sidebar. Should the comment traffic grow by an order of magnitude or two, this would become a lot more work. Managing threads, keeping crap in check and getting to know active users would become much more work at a larger scale.

Successful long-term forums seem to have moderators who have specific ideas on how to keep the community healthy and who work actively towards that goal. I don't have really good references on this, but this long post from kuro5hin has good stuff on how online communities thrive or fail, and Paul Graham's post on managing Hacker News is interesting.

Also, IQ focused community management doesn't have a terribly successful history.

Your use of belief in God as a screen is not a good one. If you use it, you are a) not going to pick up people who might learn to become more rational, and b) throwing out one specific irrationality for reasons that aren't clear, as compared to other irrational behavior. We've had multiple productive LWians who have some sort of belief in God.


Seconded. Eliezer was aware of this a while ago.

Your analysis assumes that English ability is independent of intelligence. That will give a big underestimate of the potential audience size. I would guess that these days probably 50% of all humans with >130 IQ can understand English well enough to participate in LW.