
Poll - Is endless September a threat to LW and what should be done?

Post author: Epiphany | 08 December 2012 11:42PM | 4 points

After reading my "LessWrong could grow a lot" thread, various people raised concerns that growth might ruin the culture.  There has been some discussion about whether endless September, a phenomenon that kills online discussion groups, is a significant threat to LessWrong, and about what can be done.  I care a lot about this, so I volunteered to code a solution myself for free if needed.  Luke invited debate on the subject (the debate is here) and will be sent the results of this poll and asked to make a decision.  He suggested in an email that I wait a little while before posting my poll (meta threads are apparently annoying to some, so we let people cool off).  Here it is, preceded by a Cliff's Notes summary of the concerns.


Why this is worth your consideration:

 - Yvain and I checked the IQ figures in the survey against other data this time, and the good news is that the claim that the average LessWronger is gifted is now more believable.  The bad news is that LessWrong's IQ average has decreased on each survey.  It can be argued that the decrease is small or that we don't have enough data, but if the data is good, LessWrong's average has lost 52% of its giftedness since March of 2009 (a worked sketch of this arithmetic follows this list).

 - Eliezer documented the arrival of poseurs (people who superficially copycat cultural behaviors and are reported to overrun subcultures), whom he termed "Undiscriminating Skeptics".

 - Efforts to grow LessWrong could trigger an overwhelming deluge of newbies.

 - LessWrong registrations have been increasing fast, and it's possible that growth could outstrip acculturation capacity. (Chart here)

 - The Singularity Summit appears to cause a deluge of new users that may have a similar effect to the September deluges of college freshmen that endless September is named after.  (This chart shows a spike correlated with the 2011 summit: 921 users joined that month, which is roughly equal to the total number of active users LW tends to have in a month, going by the surveys or Vladimir's wget.)

 - A Slashdot effect could result in a tsunami of new users if a publication with lots of readers like the Wall Street Journal (they used LessWrong data in this article) decides to write an article on LessWrong.

 - The sequences contain a lot of the culture and are long, meaning that "TL;DR" may make LessWrong vulnerable to cultural disintegration.  (New users may not know how detailed LW culture is or that the sequences contain so much of it.  I didn't.)

 - Eliezer said in August that the site was "seriously going to hell" due to trolls.

 - A lot of people raised concerns.
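
To make the 52% figure concrete, here is one way the arithmetic can work, sketched in Python. The survey averages and the 132-point gifted cutoff below are placeholder assumptions for illustration, not the actual survey values; the idea is just that "giftedness" is read as IQ points above the cutoff.

```python
# Hedged illustration of the "lost 52% of its giftedness" arithmetic.
# All numbers here are placeholders, not the real survey figures.

GIFTED_CUTOFF = 132  # a common threshold for "gifted"; an assumption here

def giftedness_loss(old_avg, new_avg, cutoff=GIFTED_CUTOFF):
    """Fraction of the points-above-cutoff margin lost between surveys."""
    old_margin = old_avg - cutoff
    new_margin = new_avg - cutoff
    return (old_margin - new_margin) / old_margin

# e.g. a drop from an average of 146 to 138.7 loses about 52% of the margin:
print(f"{giftedness_loss(146.0, 138.7):.0%}")  # -> 52%
```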

 

Two Theories on How Online Cultures Die:


  Overwhelming user influx.
  There are too many new users to be acculturated by older members, so they form their own, larger new culture and dominate the group.

  Trending toward the mean. 
  A group forms because people who are very different want a place to be different together.  The group attracts more people who are closer to the mainstream than people who are equally different, simply because mainstream people are more numerous.  The larger group attracts people who are even less different in the original group's way, for similar reasons.  The original group is slowly overwhelmed by newcomers who will never understand it, because they are too different from the founders.

 

Poll Link:

Endless September Poll.


Request for Feedback:

In addition to constructive criticism, I'd also like the following:

  • Your observations of a decline or increase in quality, culture or enjoyment at LessWrong, if any.

  • Ideas to protect the culture.

  • Ideas for tracking cultural erosion.

  • Ways to test the ideas to protect the culture.

 

Comments (259)

Comment author: pleeppleep 09 December 2012 02:47:05PM *  18 points [-]

I'm disappointed in some of you. Am I the only person who prefers feeling elitist and hipstery to spreading rationality?

In all seriousness, though, I don't see why this is getting downvoted. Eternal September probably isn't our biggest issue, but the massive increase in users is likely to cause problems, and those problems should be addressed. I personally don't like the idea of answering the horde of newbies with restrictions based on seniority or karma. That's not really fair, and it can select for posters who have used up their best ideas while shutting out new viewpoints. I much prefer the calls for restrictions based on merit and understanding, like the rationality quiz proposed below, or attempts to enlighten new users (or even older users who have forgotten some of the better memes here). I also like the idea of a moderator of some kind, but my anti-authoritarian tendencies make me wary of allotting that person too much power: they are assuredly biased and will have a severely limited ability to control all the content here, which will generate unfairness and turn some people off.

I doubt that endless September is the main problem here, but I think it's pretty clear that this site just isn't as useful or, more importantly, as fun as it used to be. I notice that I come here less and less every day, and that more and more opinions that should be presented in discussions just aren't.

I think we need to fix that. I maintain that Lesswrong is the best thing to ever happen to me, and I want it to keep happening to other people. We need a more general assessment of the problem and ways to solve it. I honestly do miss some of the (admittedly somewhat elitist) optimism that used to flood this site.

We're rationalists. We aimed to build gods, eradicate the plagues of the human mind, and beat death itself. We said we'd win a staring contest with the empty, uncaring abyss of reality. We sought to rewrite human knowledge; to decide what, over the past 8 thousand years, was useful, and what wasn't.

If we can't keep one little community productive, we might as well hang up our hats and let the world get turned into paper clips, because we've shown there's not much we can do about it one way or the other.

Comment author: Epiphany 09 December 2012 08:09:05PM *  0 points [-]

Eternal September probably isn't our biggest issue

What would that be in your opinion?

the massive increase in users is likely to cause problems, and those problems should be addressed.

Thank you pleeppleep for bringing this up. I am especially curious about why this thread has been hovering between -3 and 1 karma when the majority of people are concerned about this, and have chosen a solution for at least one problem. If you get any theories, please let me know.

more and more opinions that should be presented in discussions just aren't.

People have theorized that the users who might post these discussions are too intimidated to post. Do you have additional theories, or do you think this is the problem, too?

I maintain that Lesswrong is the best thing to ever happen to me

It is one of the best things that's happened to me, too. I feel strongly about protecting it.

We need a more general assessment of the problem and ways to solve it.

How would you describe the problem? What would you suggest for ways to assess it?

I honestly do miss some of the (admittedly somewhat elitist) optimism that used to flood this site.

What do you mean by that, exactly? (No, I will not bite your head off about elitism. I have strong feelings about the specific type of elitism that means abusing others with the excuse that one is "better than" them, but I am curious to hear about any and all other varieties.)

We're rationalists. We aimed to build gods, eradicate the plagues of the human mind, and beat death itself. We said we'd win a staring contest with the empty, uncaring abyss of reality. We sought to rewrite human knowledge. To decide what, over the past 8 thousand years, was useful, and what wasn't.

Wow. That's inspirational. Okay, I think I know what "elitist optimism" means now. I don't agree with the goal of building gods (an awesome idea but super dangerous), but I want to quote this in places. I will need to find places to quote it in.

If we can't keep one little community productive, we might as well hang up our hats and let the world get turned into paper clips, because we've shown there's not much we can do about it one way or the other.

Upvote. (:

Comment author: pleeppleep 09 December 2012 08:57:58PM 5 points [-]

What would that be in your opinion?

I'd say our biggest issue lately is lack of direction. The classic topics are getting kinda old now, and we don't really seem to be able to commit to any replacements. The material in the sequences is pretty firmly established, so nobody talks much about it anymore, and without it we kinda drift to things like the "rational" way to brush your teeth. If the site starts to feel watered down, I don't think it's because of new users, but because of shallow topics. Endless September is probably the biggest issue drawing us towards the mainstream.

People have theorized that the users who might post these discussions are too intimidated to post. Do you have additional theories, or do you think this is the problem, too?

I'm not really sure what the cause of this is, but I'd say that the above theory, or general apathy on the part of some of the better contributors, are the most likely explanations.

How would you describe the problem? What would you suggest for ways to assess it?

Like I said before, the site's starting to feel watered down. It seems like the fire that drew us here is beginning to die down. It's probably just an effect of time letting the ideas settle in, but I still think we should be able to counter the problem if we're all we're cracked up to be.

I think it's really good that Eliezer is writing a new sequence, but I don't think he can support the community's ambition all on his own anymore. We need something new. Something that gets us at least as excited as the old sequences. Something that gets us back in the mood to take on the universe, blind idiot god and all.

I think that a lot of us just sort of settled back into our mundane lives once the high from thinking about conquering the stars wore off. I think we should find a way to feel as strong as we did once we realized how much of man's mind is malfunctioning and how powerful we would become if we could get past that. I really don't know if we can recapture that spirit, but if it's possible, then it shouldn't be harder than figuring out FAI.

Comment author: printing-spoon 09 December 2012 12:14:05AM *  30 points [-]

I think this site is dying because there's nothing interesting to talk about anymore. Discussion is filled with META, MEETUP, SEQ RERUN, links to boring barely-relevant articles, and idea threads where the highest comment has more votes than the thread itself (i.e. a crappy idea). Main is not much better. Go to archive.org and compare (date chosen randomly, aside from being a while ago). I don't think Eternal September is the whole explanation here -- you only need 1 good user to write a good article.

Comment author: Viliam_Bur 09 December 2012 10:23:28PM *  10 points [-]

Discussion is filled with META, MEETUP, SEQ RERUN, links to boring barely-relevant articles

The website structure needs to be changed. "Main" and "Discussion" simply do not reflect the LW content today.

We should have a separate "Forum" (or some other name) category for all the non-article discussion threads like Open Thread, Media Thread, Group Rationality Thread, and stuff like this.

Then, the "Discussion" should be renamed to "Articles" (and possibly "Main" to "Main Articles") to make it obvious what belongs there.

Everything else should be downvoted; optionally with a comment: "This belongs in the Open Thread". (And if the author says they didn't know that the Open Thread exists, there is something seriously wrong... about the structure of the website.)

I feel like I have written this in LW discussions at least a dozen times...

there's nothing interesting to talk about anymore.

I think there are interesting things here. They are just drowned in too many less interesting things.

Let's look at the numbers: 6 articles so far on Dec 9th; 6 articles on Dec 8th; 4 articles on Dec 7th; 11 articles on Dec 6th; 8 articles on Dec 5th; and some of the articles from Dec 4th -- less than one week ago -- already don't fit on the first "Discussion" page. (The exact numbers may differ depending on your Preferences settings.) The page is scrolling insanely fast. If I stopped reading LW for one week, I would have trouble catching up with all the new stuff; I would probably just skip some of it. Having this much stuff with low average quality is not good.

We don't downvote enough. Let me explain -- if someone makes a post that is not very good, but is not completely stupid or trolling either, it will almost certainly gain more upvotes than downvotes, because it feels wrong to punish someone merely for being uninteresting. But in terms of rewarding and punishing behavior, we probably should punish them. If we try to be too friendly, the site will become boring, precisely because most social talk is not about giving new information.

Perhaps it could help to use some timeless deciding. When you read an article, ask yourself: "If there were 10 new articles like this here next week, would it make LW better or worse?" If the answer is worse, downvote it. Although the same author will not write 10 more articles like this during the next week, other authors will.

TL;DR -- I think the Eternal September is most visible on the article level, because it is not obvious what kind of content belongs here. "Discussion" is horribly misleading -- we don't want discussion-level articles. That's what the comments and Open Threads are for.

Comment author: palladias 09 December 2012 01:59:41AM 8 points [-]

One issue with the LW/CFAR approach is that the focus is on getting better/more efficient at pursuing your goals, but not on deciding whether you're applying your newfound superpowers to the right goals. (There's a bit of this with efficient altruism, but those giving opportunities are more about moving people up Maslow's hierarchy of needs, not about figuring out what to want when you're not at subsistence level.)

Luke's recent post suggests that almost no one here has the prereqs to tackle metaphysics or normative ethics, but that has always seemed like the obvious next topic for rationality-minded people. I was glad when Luke was writing his Desirism sequence back at CSA, but it never got to the point where I had a decent enough model of what normative claims desirism made to be able to evaluate it.

Basically, I think these topics would let us set our sights a little higher than "Help me optimize my computer use" but I think one major hurdle is that it's hard to tackle these topics in individual posts, and people may feel intimidated about starting sequences.

Comment author: Eugine_Nier 10 December 2012 03:35:25AM 3 points [-]

The problem is that there is an unfortunate tendency here, going all the way up to EY, to dismiss philosophy and metaphysics.

Comment author: NancyLebovitz 10 December 2012 01:54:23PM *  2 points [-]

idea threads where the highest comment has more votes than the thread itself (i.e. a crappy idea)

It depends -- if the higher-voted comments are expanding on the original post, then I'd say the post was successful because it evoked good-quality thought, assuming that the voters have good judgement. If the higher-voted comments are refuting the original post, then it was probably a bad post.

Comment author: Epiphany 09 December 2012 12:39:42AM 1 point [-]

Do you have a theory as to why there aren't enough good users, or why they are not writing good articles?

Comment author: John_Maxwell_IV 09 December 2012 07:18:56AM *  8 points [-]

One possibility is that the kind of content printing-spoon likes is easy to get wrong, and therefore easy to get voted down for, and therefore the system is set up with the wrong incentives (for the kind of content printing-spoon likes). I'd guess that for most users, the possibility of getting voted down is much more salient than the possibility of getting voted up. Getting voted down represents a form of semi-public humiliation (it's not like reddit, where if you post something lame it gets downvoted and consequently becomes obscure).

The great scientists often make this error. They fail to continue to plant the little acorns from which the mighty oak trees grow. They try to get the big thing right off. And that isn't the way things go.

You and Your Research

See this thread for more: http://lesswrong.com/lw/5pf/what_were_losing/

Overall, I suspect that LW could stand to rely less on downvoting in general as a means of influencing user behavior. It seems like meta threads of this type often go something like "there's content X I hate, content Y I hate, and practically no content at all, really!" Well if you want more content, don't disparage the people writing content! It may make sense to moderate voting behavior based on how much new stuff is being posted--if hardly any new stuff is being posted, be more willing to upvote. If there's lots of stuff competing for attention, vote down lamer stuff so the good stuff gets the recognition it deserves.

I think we could stand to see high-karma LWers who rarely post in Main/Discussion post there more. Maybe make it impossible for anyone with over X karma to get voted below 0 in a Main or Discussion post. Or make a new subforum where high-karma users can post free of moderation. (I'll admit, the oligarchical aspect of this appeals to me.)
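
A minimal sketch of that karma-floor idea, with invented names and thresholds (nothing here reflects the actual LW/Reddit codebase):

```python
# Hypothetical vote handler: posts by authors above a karma threshold
# can still be downvoted, but their score never drops below zero.

FLOOR_EXEMPT_KARMA = 1000  # the "X karma" in the proposal; arbitrary here

def apply_vote(post_score: int, author_karma: int, delta: int) -> int:
    """Return a post's new score after an upvote (+1) or downvote (-1)."""
    new_score = post_score + delta
    if author_karma >= FLOOR_EXEMPT_KARMA:
        return max(new_score, 0)  # clamp at zero for established authors
    return new_score

# A 1500-karma author's post sitting at 0 stays at 0 after a downvote:
print(apply_vote(0, 1500, -1))  # -> 0
```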

Also, maybe be realistic about the fact that most people are not going to be willing to go to lukeprog/gwern lengths to dig up papers related to their posts, and figure out the best way to live with that.

Comment author: Nominull 09 December 2012 06:35:32PM 1 point [-]

Well, I tried to make a post once, got downvoted into oblivion, and decided not to put myself through that again. So yeah, this happens for real, although perhaps in my case it is no big loss.

Comment author: printing-spoon 09 December 2012 04:53:49AM *  1 point [-]

I'm not sure... I think the topics I find most interesting are simply used up (except for a few open questions on TDT or whatever). There's also the recent focus on applied rationality / advice / CFAR stuff... this is a subject which seems to invite high numbers of low-quality posts. In particular, posts containing advice are generally stuffed with obvious generalizations and lack arguments or evidence beyond a simple anecdote.

Also, maybe the regular presence of EY's sequences provided a standard for quality and topic that ensured other people's posts were decent (I don't think many people read seq reruns, especially not old users who are more likely to have good ideas).

Comment author: [deleted] 09 December 2012 05:25:11AM 1 point [-]

See my comment for one possibility.

Comment author: JoshuaFox 09 December 2012 07:24:20AM *  15 points [-]
  1. Raise the karma threshold for various actions.
  2. Split up the various SEQ RERUN, META, MEETUP into "sub-LWs" so that those who are not interested do not need to see it.
  3. Likewise, split up topics: applied rationality, FAI, and perhaps a few others. There can still be an overview page for those who want to see everything.
  4. Perhaps this is offtopic, but add an email-notification mechanism for the inbox. This would reduce the need to keep coming back to look for responses, and so reduce the annoyance level.
Comment author: devas 09 December 2012 01:05:49PM 6 points [-]

I agree strongly with #2, 3, and 4.

Particularly 2, since the absence of category divisions makes all discussion harder to browse... at least for me.

Comment author: Epiphany 14 December 2012 07:51:00AM *  5 points [-]

It has occurred to me that LessWrong is divided against itself with two conflicting directives:

  1. Spread rationality.
  2. Be a well-kept garden.

Spreading rationality implies helping as many new people as possible develop improved rational thinking abilities, but being a well-kept garden specifically demands censorship and/or bans of "fools" and people who are not "fun".

"A house divided against itself cannot stand." (Lincoln)

I think this fundamental conflict must be resolved in some way. If not, the risk is that LessWrong's discussion area will produce neither of those outcomes. If it fills with irrational people, the rational ones will go elsewhere, and the irrational people who remain won't make one another more rational. They will instead most likely adopt some superficial version of rationality reminiscent of Feynman's descriptions of cargo cult science or Eliezer's descriptions of undiscriminating skeptics.

Perhaps there's some article from Eliezer I'm unaware of that says something to the effect of "The discussion is supposed to be where the rational people produce rational thought and everyone else can lurk and that's how rationality can be spread." If so, I hope that this is pointed out to me.

Without some clear explanation of how LessWrong is supposed to both spread rationality and be a well-kept garden, we're likely to respond to these directives inadequately.

Comment author: RichardKennaway 14 December 2012 09:23:38AM 4 points [-]

Every school has this problem: how to welcome people who as yet know little and raise them up to the standard we want them to reach, while allowing those already there to develop further. Universities solve this with a caste distinction between the former (students) and the latter (faculty), plus a few bridging roles (grad student, intern, etc.). On a much smaller scale, the taiko group I play with has found the same problem of dividing beginners from the performing team. It doesn't work to have one class that combines introductory practice with performance rehearsal. And there can be social problems of people who simply aren't going to improve getting disgruntled at never being invited to join the performing team.

In another comment I suggested that this division already exists: LessWrong and CFAR. So the question is, does LessWrong itself need a further splitting between welcoming beginners and a "serious" inner circle? Who would be the advanced people who would tend the beginners garden? How would membership in the inner circle be decided?

Comment author: [deleted] 15 December 2012 06:12:39AM *  2 points [-]

Missions, perhaps? A few ideas: "We are rationalists, ask us anything" as an occasional post on reddit. Drop links and insightful comments around the internet where interesting people hang out.

Effect #1 is to raise the profile of rationality in the internet community in general, so that more people become interested. Effect #2 is that smart people click on our links and come to LW. I myself was linked to LW at first by a random link dropped in r/transhumanism or something. I immediately recognized the awesomeness of LW, and ate the sequences.

On the home front, I think we should go whole hog on being a well kept garden. Here's why:

  1. There's no such thing as a crowd of philosophers. A movement should stay small and high quality as long as possible. The only way to maintain quality is to select for quality.

  2. There are a lot of people out there, such that we could select for any combination of traits we liked and be unlikely to run out of noobs. We will have a much easier time at integration and community maintenance if we focus on attracting only the right folks.

I don't think we have to worry about creating rationalists from normals. There are enough smart proto-rationalists out there just itching to find something like LW, that all we have to do is find them, demonstrate our powers, and point them here. We should focus on collecting rationalists, not creating them. (Is there anyone for whom this wouldn't have worked? Worse, is there any major subset of good possible LWers that this turns off?)

As for integrating new people, I think the right people will find a way and it's ok if everyone else gets turned off. This might be pure wishful thinking. What are other people's thoughts on this?

Overall, have the low level missionary work happen out there where it belongs. Not in these hallowed halls.

As for what to do with these hallowed halls, here's my recommendations:

  1. Elect or otherwise create an Official Community Organizer whose job it is to integrate all the opinions and make the decisions about the direction of LW. I think they would also provide direct, friendly encouragement to the meetup organizers, who are currently totally lacking in coordination and support.

  2. Sort out this crazy discussion/main bullshit. The current setup has very few desirable properties. I don't know what the solution should be, but we should at least be trying things. This of course requires someone to come up with ideas and code them. Would it be bad to try a different arrangement for a month?

  3. Fix the front page. The valuable stuff there is approximately the banner, "featured posts", and current activity links. Everything else is of dubious value. The LW front page should be slick. Right now it looks like it was designed by a committee, and probably turns off most of our potentials.

  4. Properly organize and index the LW material. This is a pretty big project; LW has thousands of good posts. This project neatly fits in with and builds on the current work in the wiki. The goal is a single root page from which every major insight is linked in at least a passable reading order. Like a textbook TOC. This obviously would benefit from wiki-improvements in general, for which I recommend merging wiki and LW accounts, and making wiki activity more visible and encouraged, among other things.

  5. Friendship threads where we pick a partner and get to know them. Would generally increase community-coherence and civility. After meeting a lot of the other posters at the CFAR minicamp, I get more friendly feels in the community.

  6. Somehow come up with the funding and political will to do all this stuff.

  7. Something about trolls and idiots. Is this even a problem once the above are solved?


As for you, Epiphany, I want to commend you for still being at the throat of this problem, and still generating ideas and analysis. I'm impressed and humbled. Keep up the good work.

Comment author: Nominull 09 December 2012 12:32:37AM 25 points [-]

The destruction of LW culture has already happened. The trigger was EY leaving, and people without EY's philosophical insight stepping in to fill the void by chatting about their unconventional romantic lives, their lifehacks, and their rational approach to toothpaste. If anything, I see things having gotten somewhat better recently, with EY having semi-returned, and with the rise of the hypercontrarian archconservative clique, which might be wrong about everything but at least they want to talk about it and not toothpaste.

Comment author: John_Maxwell_IV 09 December 2012 06:35:29AM 5 points [-]

A related request: there are a lot of goals common enough that better achieving these goals should be of interest to a large-ish portion of LW. I'm thinking here of: happiness; income; health; avoiding auto accidents; reading more effectively; building better relationships with friends, family, dating partners, or co-workers; operationalizing one's goals to better track progress; more easily shedding old habits and gaining new ones.

Could we use our combined knowledge base, and our ability to actually value empirical data and consider counter-evidence and so on, to find and share some of the better known strategies for achieving these goals? (Strategies that have already been published or empirically validated, but that many of us probably haven’t heard?) We probably don’t want to have loads and loads of specific-goaled articles or links, because we don’t want to look like just any old random internet self-help site. But a medium amount of high-quality research, backed by statistics, with the LW-community’s help noticing the flaws or counter-arguments -- this sounds useful to me. Really useful. Much of the advantage of rationality comes from, like, actually using that rationality to sort through what’s known and to find and implement existing best practices. And truth being singular, there’s no reason we should each have to repeat this research separately, at least for the goals many of us share.

Anna Salamon, 2009. So this "destruction" was at least semi-planned.

Comment author: Epiphany 11 December 2012 02:24:48AM *  1 point [-]

I read that twice, and went to the post you linked to, and am still not seeing why it supports the idea:

this "destruction" was at least semi-planned.

Maybe you are viewing optimization related posts as a form of cultural collapse?

Comment author: John_Maxwell_IV 11 December 2012 03:20:39AM 1 point [-]

Nominull seemed to be. I was patterning my use of "destruction" after theirs. I don't see it as destruction myself.

Comment author: [deleted] 09 December 2012 04:59:05AM 2 points [-]

hypercontrarian archconservative clique

lulz. Why do I feel identity-feels for that phrase? I should watch out for that, but,

which might be wrong about everything

That's what I thought a few months ago. Then everything turned inside out and I <metaphor> realized there is no god </metaphor>. What a feeling! Now I see people confidently rationalizing the cultural default, and realize how far we have to go WRT epistemic rationality.

Comment author: [deleted] 09 December 2012 12:46:23AM 2 points [-]

If EY didn't intend for said "destruction" to happen, he should have chosen a website model more suitable to that end.

Comment author: metatroll 15 December 2012 10:01:27AM 0 points [-]

tl;dr: The following is a non-profit fan-based parody. Less Wrong, the Singularity Institute, and the Centre for Applied Rationality are owned by Hogwarts School, Chancellor Ray Kurzweil, and the Bayesian Conspiracy. Please support the official release.

Troll Wrongosphers with Baumeister and Eddington, not Benedict and Evola

Wrongosophical trolling should be based on genuinely superior psychological insights ("Baumeister" for breakthroughs in social psychology such as those summarized in Vohs & Baumeister 2010) and on crackpot science that is nonetheless difficult to debunk ("Eddington" for the fundamental theory described in Durham 2006). Starting from reaction and religion, as many trolls still do, both (1) promotes unpleasant ideas like God and conservatism and (2) fails to connect with the pragmatic and progressive sensibility of 21st-century culture. Once young trollosophers are equipped with some of the best newthink and pseudoscience, then let them dominate the subversive paradigm. I'll bet they get farther than the other kind.

Comment author: Alicorn 09 December 2012 12:20:22AM 29 points [-]

So far, I've been more annoyed on LessWrong by people reacting to fear of "cultural erosion" than by any extant symptoms of same.

Comment author: Vaniver 09 December 2012 09:03:54PM 8 points [-]

The fear is that this is due to a selection effect. Of the people I know through LW, a disappointing number have stopped reading the site. One of my hobbies, for over a decade now, has been posting on forums, and so the only way I'd stop reading / posting on LW is if I find a forum more relevant to my interests. (For the curious, I've moved from 3rd edition D&D to xkcd to here over that timeframe, and only post in xkcd's MLP and gaming threads these days.) For many of the former LWers I know, forum-posting isn't one of their hobbies, and they came here for the excellent content, primarily by EY. Now that there aren't blog posts that they want to read frequently enough, they don't come, and I'm not sure that any of them even know that EY has started posting a new sequence.

I think that this fear is mostly misplaced, because the people in that class generally aren't the people posting the good content, and I think any attempt to improve LW should be along the lines of "more visible good content" and not "less bad content," but it's important for evaporative cooling reasons to periodically assess the state of content on LW.

Comment author: David_Gerard 10 December 2012 12:30:30AM 4 points [-]

Not only do communities have a life cycle, people's membership in communities does. People give all sorts of reasons for leaving a community (e.g. boredom, other interests, deciding the community is full of assholes, an incident they write over one megabyte of text complaining about), but the length of participation is typically 12 to 18 months regardless. Anything over that, you're a previous generation.

So I wouldn't be disappointed unless they stopped before 12-18 months.

Comment author: Viliam_Bur 09 December 2012 10:35:39PM *  2 points [-]

any attempt to improve LW should be along the lines of "more visible good content" and not "less bad content,"

Why not both?

Speaking for myself, a lot of bad content would make me less likely to post good content. My instincts tell me: if other people don't bother with quality here, why should I?

Comment author: Vaniver 09 December 2012 11:28:15PM *  0 points [-]

Why not both?

I separate those because I think the second is a distraction. It seems to me that the primary, and perhaps only, benefit from reducing bad content is increasing the visibility of good content.

Speaking for myself, a lot of bad content would make me less likely to post good content. My instincts tell me -- if other people don't bother here with quality, why should I?

It still seems like there are incentives (better posts will yield more karma), and I suspect it matters who the other people who don't bother are. Right now, we have spammers (particularly on the wiki) who don't bother at all with being helpful. Does that make you more likely to post commercial links on the wiki? If everyone you thought was more insightful than you stopped bothering to write posts and comments, then it seems likely that you would wonder what the benefit of putting more effort into LW was. More high-quality posts seem useful as an aspirational incentive.

Comment author: Epiphany 09 December 2012 11:14:32PM *  1 point [-]

I wonder if it would make a big difference to add email notifications... perhaps the kind where you only receive a notification when something with more than X karma is posted?

That would keep users from forgetting about the site entirely, and draw more attention and karma (aka positive reinforcement) to those who post quality things.

Hmm, that would also keep older users logging in, which would help combat both trending toward the mean and new users outstripping acculturation capacity.
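
A minimal sketch of what such a notification filter might look like, with invented data structures (nothing here reflects the real LW codebase, and the 10-point cutoff is an arbitrary example):

```python
# Hypothetical karma-threshold digest: only posts clearing a subscriber's
# threshold make it into the notification email.

KARMA_THRESHOLD = 10  # arbitrary example cutoff

def build_digest(posts, threshold=KARMA_THRESHOLD):
    """Keep only posts whose score clears the threshold."""
    return [p for p in posts if p["karma"] >= threshold]

if __name__ == "__main__":
    recent = [
        {"title": "Open Thread, December", "karma": 3},
        {"title": "A new sequence post", "karma": 27},
    ]
    digest = build_digest(recent)
    if digest:  # an empty digest sends nothing, so quiet weeks stay quiet
        body = "\n".join(f"{p['karma']:>4}  {p['title']}" for p in digest)
        print(body)  # a real version would email this instead of printing
```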

Comment author: NancyLebovitz 10 December 2012 02:00:50PM 1 point [-]

I think that would bring back only the most marginally interested users, and would be likely to annoy a good many people who'd drifted away.

Notification of posts with karma above a chosen threshold might be better.

For that matter, a customizable LW in which you could choose to see only posts with karma above a threshold might be good. It would be even better if posts could also be selected/deselected by subject, but that sounds like a hard problem.

Comment author: Viliam_Bur 09 December 2012 10:40:54PM 2 points [-]

I am annoyed by both. Not enough to consider leaving this weedy garden yet.

Comment author: Kindly 09 December 2012 04:59:57PM 0 points [-]

Same here, but I have no clue how to address this problem. I suspect making discussion posts complaining about people complaining about cultural erosion would be the wrong approach.

Comment author: Epiphany 09 December 2012 11:18:27PM *  0 points [-]

(nevermind)

Comment author: metatroll 09 December 2012 01:40:06AM *  24 points [-]

The Popular Struggle Committee for Salvation of Less Wrong calls for the immediate implementation of the following measures:

1) Suspension of HPMOR posting until the site has been purged. All new users who join during the period of transition will be considered trolls until proven otherwise. Epiphany to be appointed Minister of Acculturation.

2) A comprehensive ban on meta-discussion. Articles and comments in violation of the ban will be flagged as "meta" by the moderators, and replying to them will incur a "meta toll" of -5 karma. A similar "lol toll" shall apply to jokes that aren't funny.

3) All meetups for the next six months to consist of sixty minutes of ideological self-criticism and thirty minutes of weapons training.

Comment author: gwern 09 December 2012 03:04:46AM 2 points [-]

I second these motions... with a vengeance. For is it not said:

The revolutionary war is a war of the masses; it can be waged only by mobilizing the masses and relying on them.

and

Liberalism is extremely harmful in a revolutionary collective. It is a corrosive which eats away unity, undermines cohesion, causes apathy and creates dissension. It robs the revolutionary ranks of compact organization and strict discipline, prevents policies from being carried through and alienates the Party organizations from the masses which the Party leads. It is an extremely bad tendency.

and especially:

To criticize the people's shortcomings is necessary, . . . but in doing so we must truly take the stand of the people and speak out of whole-hearted eagerness to protect and educate them.

Comment author: OrphanWilde 10 December 2012 05:28:12PM 3 points [-]

This post reminds me of Eliezer's own complaints against Objectivism: that Ayn Rand's ingroup became increasingly selective as time went on, developing a self-reinforcing fundamentalism.

As I wrote in one of my blogs a while back, discussing another community that rejects newcomers:

"This is a part of every community. A community which cannot or will not do this is crippled and doomed, which is to say, it -is- their jobs to [teach new members their mores]. This is part of humanity; we keep dying and getting replaced, and training our replacements is a constant job. We cannot expect that people should "Just know" the right way to behave, we have to teach them that, whether they're twelve, twenty two, or eighty two"

Comment author: Nick_Tarleton 12 December 2012 05:03:45AM *  6 points [-]

An elite intellectual community can^H^H^H has to mostly reject newcomers, but those it does accept it has to invest in very effectively (while avoiding the Objectivist failure mode).

I think part of the problem is that LW has elements of both a ground for elite intellectual discussion and a ground for a movement, and these goals seem hard or impossible to serve with the same forum.

I agree that laziness and expecting people to "just know" is also part of the problem. Upvoted for the quote.

Comment author: katydee 14 December 2012 04:46:28AM 2 points [-]

I'm not entirely sure that expecting people to "just know" is a huge problem here, as on the Internet appropriate behavior can be inferred relatively easily by reading past posts and comments -- hence the common instruction to "lurk more."

One could construe this as a filter, but if so, who is it excluding? People with low situational awareness?

Comment author: [deleted] 09 December 2012 05:23:02AM *  6 points [-]

I think we could use more intellectual productivity. I think we already have the capacity for a lot more, and I think that would do a lot against any problems we might have. Obviously, I am aware of the futility of the vague "we" in this paragraph, so I'll talk about what I could do but don't.

I have a lot of ideas to write up. I want to write something on "The improper use of empathy", something about "leading and following", something about social awkwardness from the inside. I wrote an article about fermi estimation that I've never posted. And some other ideas that I can't remember right now. I'll admit I have one meta-essay in here somewhere too. "Who's in charge here?"

I don't write as much for LW as I could, because I feel like a mere mortal among gods. I feel kind of inadequate, like I would be lowering the level of discussion around here. Ironically, the essays that I do post are all quite well upvoted, and not posting may be one source of the lowered quality of LW.

I may not be the only one.

EDIT: this is also why I post to discussion and not main.

Comment author: John_Maxwell_IV 09 December 2012 06:47:06AM *  6 points [-]

Yep, that's my experience as well. Recently, I decided "screw what LW thinks" and started posting more thoughts of mine, and they're all getting upvoted. My vague intuitions about how many upvotes my posts will get don't seem to correlate very well with how many upvotes they actually get, either. This is probably true for other people as well.

The only potential problem with this, IMO, is if people think I'm more of an authoritative source than I actually am. I'm just sharing random thoughts I have; I don't do scholarly work like gwern.

Comment author: Epiphany 11 December 2012 02:34:31AM *  1 point [-]

It seems to me that there are lots and lots of people who want to write posts but are concerned about whether those posts will be received well. I've also read that, when surveyed, more people name "public speaking" than "death" as their worst fear. If we made a karma prediction tool, maybe that would help get people posting here. Here's what I'm thinking:

First, we could create a checklist of the traits that we think will get a LessWrong post upvoted. For instance:

  • Is there an obvious main point or constructive goal?
  • Is the main point supported / is there a reasonable plan for the constructive goal? (Or are they otherwise framed in the correct context "This is hypothetical" or whatever.)
  • What type of support is included (math, citations, graphics, etc.).
  • Was the topic already covered?
  • Is it a topic of interest to LessWrong?
  • Is it uplifting or unhappy?
  • (Or do a separate survey that asks people's reasons for upvoting / downvoting and populate the checklist with those.)

Then we could post the checklist as a poll in each new post and article for a while.

Then we could correlate the karma data with the checklist poll data and test it to see how accurately it predicts a post's karma.

If you had a karma prediction tool, would it help you post more?
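
For what it's worth, here is a toy sketch of how the checklist-to-karma correlation step could be tested, using an ordinary least-squares fit. The checklist items, training data, and coefficients are all invented for illustration; this is not a claim about what actually predicts karma on LW.

```python
# Toy karma predictor: binary checklist answers -> rough karma forecast.
# The data below is made up; a real version would use the poll results.

import numpy as np

CHECKLIST = ["clear_main_point", "claims_supported",
             "topic_of_lw_interest", "not_already_covered"]

# Hypothetical training data: checklist answers (1 = yes) and final karma.
X = np.array([[1, 1, 1, 1],
              [1, 0, 1, 0],
              [0, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 0, 0, 0]], dtype=float)
y = np.array([24, 6, 3, 15, -2], dtype=float)

# Least-squares fit with an intercept column appended to the features.
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(answers):
    """Forecast karma for a draft given its 0/1 checklist answers."""
    return float(np.dot(np.append(answers, 1.0), coef))

print(round(predict([1, 1, 1, 0]), 1))  # forecast for a hypothetical draft
```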


Comment author: satt 24 February 2013 09:04:45PM 0 points [-]

Posting that checklist as a poll in each new post would likely end up irritating people.

A simpler approach, with the twin advantages of being simpler and being something one can do unilaterally, would be to just count the proportion of recent, non-meetup-related Discussion posts with positive karma. Then you could give potential post authors an encouraging reference class forecast like "85% of non-meetup Discussion posts get positive karma".
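
A sketch of that baseline computation, over invented post data (a real version would pull recent titles and scores from the site):

```python
# Share of recent non-meetup Discussion posts with positive karma.
recent_posts = [
    {"title": "Meetup: Berkeley", "karma": 4},
    {"title": "On calibration", "karma": 12},
    {"title": "Link dump", "karma": -1},
    {"title": "Checklist experiment", "karma": 7},
]

eligible = [p for p in recent_posts if not p["title"].startswith("Meetup")]
positive = sum(1 for p in eligible if p["karma"] > 0)
print(f"{positive / len(eligible):.0%} of non-meetup posts got positive karma")
```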

Comment author: Epiphany 24 February 2013 09:51:32PM 1 point [-]

You know what? That is simple and elegant. I like that about it... but in the worst-case scenario, it will encourage people to post stuff without thinking about it, because they'll make the hasty generalization that "all non-meetup posts have an 85% chance of getting some karma". And even in the best-case scenario, a lot of people will probably be thinking something along the lines of "Just because Yvain and Gwern and people who are really good at this get positive karma doesn't mean that I will."

Unfortunately, I think it would be ineffective.

Comment author: satt 24 February 2013 11:30:17PM 1 point [-]

Fair points.

Comment author: Epiphany 11 December 2012 02:39:50AM *  4 points [-]

I don't feel inadequate but I do feel likely to get jumped all over for mistakes. I've realized that you really need to go over things with a fine-toothed comb, and that there are countless cultural peculiarities that are, for me, unexpected.

I've decided that the way I will feel comfortable posting here is to carefully word my point, make sure that point is obvious to the reader, identify and mentally outline any other claims in the piece, make sure every part is supported, and then (until I get to know the culture better) ask someone to check it out for spots that will be misunderstood.

That has resulted in me doing a lot of research. So now my main bottleneck is that I feel like posting something requires doing a lot of research. This is well and good IMO, but it means I won't post anywhere near as much simply because it takes a lot of time.

I've wondered if it would do us good to form a writer's group within LW where people can find out what topics everyone else is interested in writing about (which would allow them to co-author, cutting the work in half), see whether there are volunteers to do research for posts, and get a "second pair of eyes" to detect any karma-destroying mistakes in the writings before they're posted.

A group like this would probably result in more writing.

Comment author: [deleted] 12 December 2012 04:19:27AM 2 points [-]

A group like this would probably result in more writing.

That's a really good idea.

Let me know when you've organized something.

Comment author: Epiphany 12 December 2012 07:47:01AM 0 points [-]

(: I do not have time to organize this currently. I'm not even sure I will have time to post on LessWrong. I have a lot of irons in the fire. :/

I would sure love to run a LW writer's group, though; that would be awesome. Inevitably, it would be pointed out that I am not an expert on LW culture. If things slow down and I do not see anyone else doing this, I may go for it anyway.

Comment author: [deleted] 12 December 2012 01:48:07PM 6 points [-]

(:

I can no longer hold my tongue. Your smileys are upside-down, and the tiny moments of empathetic sadness when my eyes haven't sorted out which side of the parens the colon is on are really starting to add up. :)

Comment author: Epiphany 12 December 2012 09:03:07PM 1 point [-]

Rofl. I am not sure if this is supposed to get me to stop, or get me to laugh.

Comment author: [deleted] 12 December 2012 10:08:35PM 1 point [-]

Even in the same comment, you don't orient your smileys the same way. Just saying...

Comment author: Armok_GoB 13 December 2012 12:20:04AM 1 point [-]

I have like 10 different articles I'd like to submit to this, many of which have been on ice for literally years!

Comment author: Epiphany 13 December 2012 08:37:34PM *  3 points [-]

What are your reasons for postponing? More interestingly, what would get you to post them? Would the writer's group as described above do it, or this other suggestion here?

Would something else help?

Comment author: Armok_GoB 14 December 2012 02:18:13AM *  2 points [-]

Being absolutely, utterly terrible at writing. Being utterly incapable of clear communication. Being a sloppy thinker incapable of formalizing and testing all the awesome theories I come up with.

Being rather shy and caring very very much about the opinions of this community, and very insecure in my own abilities, fearing ridicule and downvotes.

Other than that, I am extremely motivated to share all these insights I think might be extremely valuable to the world, somehow.

The suggestion mentioned wouldn't help at all. Really, anything radical enough will look less like fixing something I've written, and more like me explaining the idea and someone else writing an article about it, with me pointing out miscommunications.

Comment author: [deleted] 09 December 2012 05:52:02AM 16 points [-]

Here's two things we desperately need:

  1. An authoritative textbook-style index/survey-article on everything in LW. We have been generating lots of really cool intellectual work, but without a prominently placed, complete, hierarchical, and well-updated overview of "here's the state of what we know", we aren't accumulating knowledge. This is a big project and I don't know how I could make it happen, besides pushing the idea, which is famously ineffective.

  2. LW needs a king. This idea is bound to be unpopular, but how awesome would it be to have someone whose paid job it was to make LW into an awesome and effective community? I imagine things like getting proper studies done of how site layout/design should work to make LW easy to use and sticky to the right kind of people (it currently sucks), contacting, coordinating, and encouraging meetup organizers individually (no one does this right now and lw-organizers has little activity), thinking seriously and strategically about problems like the OP's, and leading big projects like idea #1. Obviously this person would have CEO-level authority.

One problem is that our really high-powered agent types who are super dedicated to the community (e.g. lukeprog) get siphoned off into SI. We need another lukeprog or someone to be king of LW and deal with this kind of stuff.

Without a person in this king role, the community has to waste time and effort making community-meta threads like these. Communities and democratic methods suck at the kind of strategic, centralized, coherent decision-making that we really need, and managing these problems really isn't the community's comparative advantage. If these problems were dealt with, it would be a lot easier to focus on intellectual productivity.

Comment author: Eugine_Nier 09 December 2012 06:46:04PM *  8 points [-]

LW needs a king.

The standard term is Benevolent Dictator for Life, and we already have one. What you're asking for strikes me as more of a governor-general.

Comment author: [deleted] 09 December 2012 07:49:41PM 22 points [-]

Our benevolent dictator isn't doing much dictatoring. If I understand correctly that it's EY, he has a lot more hats to wear, and doesn't have the time to do LW-managing full time.

Is he willing to improve LW, but not able? Then he is not a dictator.
Is he able, but not willing? Then he is not benevolent.
Is he both willing and able? Then whence cometh suck?
Is he neither willing nor able? Then why call him God?

As with God: if we observe a lack of leadership, it is irrelevant whether we nominally have a god-emperor or not. The solution is always the same: build a new one that will actually do the job we want done.

Comment author: Eliezer_Yudkowsky 10 December 2012 06:50:12AM 10 points [-]

Okay, that? That was one of the most awesome predicates of which I've ever been a subject.

Comment author: Epiphany 10 December 2012 08:53:59PM *  0 points [-]

You're defending yourself against accusations of being a phyg leader over there, and over here you're enjoying a comment that implies that either the commenter, or the people the commenter is addressing, perceive you as a god? Not only that, but this might even imply that you endorse the solution that is "always the same": building a new god-emperor.

Have you forgotten Luke's efforts to fight the perceptions of SI's arrogance?

That you appear to be encouraging a comment that uses the word god to refer to you in any way, directly or indirectly, is pretty disheartening.

Comment author: Eliezer_Yudkowsky 10 December 2012 09:06:28PM 6 points [-]

I tend to see a fairly sharp distinction between negative aspects of phyg-leadership and the parts that seem like harmless fun, like having my own volcano island with a huge medieval castle, and sitting on a throne wearing a cape saying in dark tones, "IT IS NOT FOR YOU TO QUESTION MY FUN, MORTAL." Ceteris paribus, I'd prefer that working environment if offered.

Comment author: Epiphany 11 December 2012 01:56:14AM *  1 point [-]

And how are people supposed to make the distinction between your fun and signs of pathological narcissism? You and I both know the world is full of irrationality, and that this place is public. You've endured the ravages of the hatchet job and RationalWiki's annoying behaviors. This comment could easily be interpreted by them as evidence that you really do fancy yourself a false prophet.

What's more is that I (as in someone who is not a heartless and self-interested reporter, who thinks you're brilliant, who appreciates you, who is not some completely confused person with no serious interest in rationality) am now thinking:

How do I make the distinction between a guy who has an "arrogance problem" and has fun encouraging comments that imply that people think of him as a god vs. a guy with a serious issue?

Comment author: fubarobfusco 11 December 2012 06:30:18AM 5 points [-]

Try working in system administration for a while. Some people will think you are a god; some people will think you are a naughty child who wants to be seen as a god; and some people will think you are a sweeper. Mostly you will feel like a sweeper ... except occasionally when you save the world from sin, death, and hell.

Comment author: Epiphany 11 December 2012 09:30:14AM 1 point [-]

I feel the same way as a web developer. One day I'm being told I'm a genius for suggesting that a technical problem might be solved by changing a port number. The next day, I'm writing a script to compensate for the incompetent failures of a certain vendor.

When people ask me for help, they assume I can fix anything. When they give me a project, they assume they know better how to do it.

Comment author: ChristianKl 11 December 2012 02:04:04PM 1 point [-]

The only way to decide whether someone has a serious issue is to read a bunch from them and then see which patterns you find.

Comment author: wedrifid 11 December 2012 05:32:53AM 1 point [-]

And how are people supposed to make the distinction between your fun and signs of pathological narcissism?

I don't see this as a particular problem in this instance. The responses are, if anything, an indication that he isn't taking himself too seriously. The more pathologically narcissistic types tend to be more somber about their power and image.

No, if there was a problem here, it would be if the joke was in poor taste: in particular, if there were those who had been given the impression that Eliezer's power or narcissism really was corrupting his thinking, if he had begun to use his power arbitrarily on a whim, or if his arrogance had left him incapable of receiving feedback or of perceiving the consequences his actions have on others or even himself. Basically, jokes about how arrogant and narcissistic one is only work when people don't perceive you as actually having problems in that regard. If you really do have arrogance problems, then joking that you have them while completely failing to acknowledge them makes you look grossly out of touch and socially awkward.

For my part, however, I don't have any direct problem with Eliezer appreciating this kind of reasoning. It does strike me as a tad naive of him, and I do agree that it is the kind of thing that makes Luke's job harder. Just... as far as PR missteps made by Eliezer go, this seems so utterly trivial as to be barely worth mentioning.

How do I make the distinction between a guy who has an "arrogance problem" and has fun encouraging comments that imply that people think of him as a god vs. a guy with a serious issue?

The way I make such distinctions is to basically ignore 'superficial arrogance'. I look at the real symptoms. The ones that matter and have potential direct consequences. I look at their ability to comprehend the words of others---particularly those others without the power to 'force' them to update. I look at how much care they take in exercising whatever power they do have. I look at how confident they are in their beliefs and compare that to how often those beliefs are correct.

Comment author: bogus 11 December 2012 02:42:39AM 1 point [-]

over here, you're enjoying a comment that implies that either the commenter, or the people the commenter is addressing perceive you as a god?

I have to agree with Eliezer here: this is a terrible standard for evaluating phygishness. Simply put, enjoying that kind of comment does not correlate at all with the harmful features of phygish organizations, social clubs, and the like. There are plenty of Internet projects that refer to their most prominent leaders with such titles as God-King, "benevolent dictator", and the like; it carries no implication at all.

Comment author: Epiphany 12 December 2012 07:59:43AM *  0 points [-]

You have more faith than I do that it will not be intentionally or unintentionally misinterpreted.

Also, I am interpreting that comment within the context of other things: the "arrogance problem" thread, the b - - - - - - k, Eliezer's dating profile, etc.

What's not clear is whether you or I are more realistic about how people are likely to interpret it: not only in a superficial context (like some hatchet-jobbing reporter who knows only some LW gossip), but also with no context, or within the context of other things with a similar theme.

Comment author: [deleted] 15 December 2012 05:12:02AM *  0 points [-]

srsly, brah. I think you misunderstood me.

you're enjoying a comment that implies that either the commenter, or the people the commenter is addressing perceive you as a god?

I was drawing an analogy to Epicurus on this issue because the structure of the situation is the same, not because anyone perceives (our glorious leader) EY as a god.

And not only that, but this might even imply that you endorse the solution that is "always the same" of "building a new one (god-emperor)".

I bet he does endorse it. His life's work is all about building a new god to replace the negligent or nonexistent one that let the world go to shit. I got the idea from him.

Comment author: Epiphany 15 December 2012 06:28:52AM *  1 point [-]

srsly, brah. I think you misunderstood me.

My response was more about what interpretations are possible than what interpretation I took.

I was drawing an analogy to Epicurus on this issue because the structure of the situation is the same, not because anyone perceives (our glorious leader) EY as a god.

Okay. There's a peculiar habit in this place where people say things that can easily be interpreted as something that will draw persecution. Then I point it out, and nobody cares.

I bet he does endorse it. His life's work is all about building a new god to replace the negligent or nonexistent one that let the world go to shit. I got the idea from him.

Okay. It probably seems kind of stupid that I failed to realize that. Is there a post that I should read?

Comment author: [deleted] 15 December 2012 07:12:05AM *  1 point [-]

Okay. There's a peculiar habit in this place where people say things that can easily be interpreted as something that will draw persecution. Then I point it out, and nobody cares.

This is concerning. My intuitions suggest that it's not a big deal. I infer that you think it's a big deal. Someone is miscalibrated.

Do you have a history with persecution that makes you more attuned to it? I am blissfully ignorant.

Okay. It probably seems kind of stupid that I failed to realize that. Is there a post that I should read?

I don't know if there's an explicit post about it. I picked it up from everything on Friendly AI, the terrible uncaringness of the universe, etc. It is most likely not explicitly represented as replacing a negligent god anywhere outside my own musings, unless I've forgotten.

Comment author: Epiphany 15 December 2012 08:43:42AM *  2 points [-]

This is concerning. My intuitions suggest that it's not a big deal. I infer that you think it's a big deal. Someone is miscalibrated.

I really like this nice, clear, direct observation.

Do you have a history with persecution that makes you more attuned to it? I am blissfully ignorant.

Yes, but more relevantly, humanity has a history with persecution - lots of intelligent people and people who want to change the world from Socrates to Gandhi have been persecuted.

Here Eliezer is in a world full of Christians who believe that dreaded Satan is going to reincarnate soon, claim to be a God, promise to solve all the problems, and take over earth. Religious people have been known to become violent for religious reasons. Surely building an incarnation of Satan would, if that were their interpretation of it, qualify as more or less the ultimate reason to launch a religious war. These Christians outnumber Eliezer by a lot. And Eliezer, according to you, is talking about building WHAT?

My take on the "build a God-like AI" idea is that it is pretty crazy. I might like this idea less than the Christians probably do, seeing as how I don't have any sense that Jesus is going to come back and reconstruct us after it does its optimization...

I don't know if there's an explicit post about it. I picked it up from everything on Friendly AI, the terrible uncaringness of the universe, etc. It is most likely not explicitly represented as replacing a negligent god anywhere outside my own musings, unless I've forgotten.

I went out looking for myself and I just watched the bloggingheads video (6:42) where Robert Wright says to Eliezer "It sounds like what you're saying is we need to build a God" and Eliezer is like "Why don't we call it a very powerful optimizing agent?" and grins like he's just fooled someone and Robert Wright thinks and he's like "Why don't we call that a euphemism for God?" which destroys Eliezer's grin.

If Eliezer's intentions are to build a God, then he's far less risk-averse than the type of person who would simply try to avoid being burned at the stake. In that case the problem isn't that he makes himself look bad...

Comment author: wedrifid 15 December 2012 02:46:54PM *  5 points [-]

I went out looking for myself and I just watched the bloggingheads video (6:42) where Robert Wright says to Eliezer "It sounds like what you're saying is we need to build a God" and Eliezer is like "Why don't we call it a very powerful optimizing agent?" and grins like he's just fooled someone

Like he's just fooled someone? I see him talking like he's patiently humoring an ignorant child who is struggling to distinguish between "Any person who gives presents at Christmas time" and "The literal freaking Santa Claus, complete with magical flying reindeer". He isn't acting like he has 'fooled' anyone or acting in any way 'sneaky'.

and Robert Wright thinks and he's like "Why don't we call that a euphemism for God?" which destroys Eliezer's grin.

While I wouldn't have been grinning previously, whatever my expression had been, it would have changed in response to that question in the direction of irritation and impatience. The answer to "Why don't we call that a euphemism for God?" is "Because that'd be wrong and totally muddled thinking". When your mission is to create an actual very powerful optimization agent, and that---and not gods---is actually what you spend your time researching, then a very powerful optimization agent isn't a 'euphemism' for anything. It's the actual core goal. Maybe, at a stretch, "God" can be used as a euphemism for "very powerful optimizing agent", but never the reverse.

I'm not commenting here on the question of whether there is a legitimate PR concern regarding people pattern-matching to religious themes having dire, hysterical and murderous reactions. Let's even assume that kind of PR concern is legitimate for the purpose of this comment. Even then there is a distinct difference between "failure to successfully fool people" and "failure to educate fools". It would be the latter task that Eliezer has failed at here, and the former charge would be invalid. (I felt the paragraph I quoted was unfair to Eliezer in blurring that distinction.)

Comment author: [deleted] 15 December 2012 06:26:18PM 2 points [-]

I really like this nice, clear, direct observation.

Thank you. I will try to do more of that.

Here Eliezer is in a world full of Christians who believe that dreaded Satan is going to reincarnate soon, claim to be a God, promise to solve all the problems, and take over earth. Religious people have been known to become violent for religious reasons. Surely building an incarnation of Satan would, if that were their interpretation of it, qualify as more or less the ultimate reason to launch a religious war. These Christians outnumber Eliezer by a lot. And Eliezer, according to you, is talking about building WHAT?

Interesting. Religious people seem a lot less scary to me than this. My impression is that the teeth have been taken out of traditional Christianity. There are a few Christian terrorists left in North America, but they seem like holdouts raging bitterly against the death of their religion. They are still in the majority in some places, though, and can persecute people there.

I don't think that the remains of theistic Christianity could reach an effective military/propaganda arm all the way to Berkeley even if they did somehow misinterpret FAI as an assault on God.

Nontheistic Christianity, which is the ruling religion right now, could flex enough military might to shut down SI, but I can't think of any way to make it care.

I live in Vancouver, where as far as I can tell, most people are either non-religious, or very tolerant. This may affect my perceptions.

My take on the "build a God-like AI" idea is that it is pretty crazy. I might like this idea less than the Christians probably do seeing as how I don't have any sense that Jesus is going to come back and reconstruct us after it does it's optimization...

This is a good reaction. It is good to take seriously the threat that an AI could pose. However, the point of Friendly AI is to prevent all that and make sure that if it happens, it is something we would want.

Comment author: Nornagest 15 December 2012 08:20:05AM *  0 points [-]

I picked it up from everything on Friendly AI, the terrible uncaringness of the universe, etc. It is most likely not explicitly represented as replacing a negligent god anywhere outside my own musings, unless I've forgotten.

I'm not sure I've heard any detailed analysis of the Friendly AI project specifically in those terms -- at least not any that I felt was worth my time to read -- but it's a common trope of commentary on Singularitarianism in general.

No less mainstream a work than Deus Ex, for example, quotes Voltaire's famous "if God did not exist, it would be necessary to invent him" in one of its endings -- which revolves around granting a friendly (but probably not Friendly) AI control over the world's computer networks.

Comment author: Jayson_Virissimo 15 December 2012 10:28:21AM *  0 points [-]

No less mainstream a work than Deus Ex, for example, quotes Voltaire's famous "if God did not exist, it would be necessary to invent him" in one of its endings -- which revolves around granting a friendly (but probably not Friendly) AI control over the world's computer networks.

ROT-13:

Vagrerfgvatyl, va gur raqvat Abeantrfg ersref gb, Uryvbf (na NV) pubbfrf gb hfr W.P. Qragba (gur cebgntbavfg jub fgvyy unf zbfgyl-uhzna cersreraprf) nf vachg sbe n PRI-yvxr cebprff orsber sbbzvat naq znxvat vgfrys (gur zretrq NV naq anab-nhtzragrq uhzna) cuvybfbcure-xvat bs gur jbeyq va beqre gb orggre shysvyy vgf bevtvany checbfr.

Comment author: mrglwrf 09 December 2012 08:35:17PM 0 points [-]

Why would you believe that something is always the solution when you already have evidence that it doesn't always work?

Comment author: [deleted] 09 December 2012 08:51:22PM 1 point [-]

Let's go to the object level: in the case of God, the fact that God is doing nothing is not evidence that Friendly AI won't work.

In the case of EY the supposed benevolent dictator, the fact that he is not doing any benevolent dictatoring is explained by the fact that he has many other things that are more important. That prevents us from learning anything about the general effectiveness of benevolent dictators, and we have to rely on the prior belief that it works quite well.

Comment author: prase 09 December 2012 05:30:43PM 8 points [-]

LW needs a king.

LW as a place to test applied moldbuggery, right?

Communities and democratic methods suck at doing the kind of strategic, centralized, coherent decision making that we really need.

Kings also suck at it, on average. Of course, if we are lucky and find a good king... the only problem is that king selection is the kind of strategic decision humans suck at.

Comment author: [deleted] 09 December 2012 06:12:20PM 7 points [-]

They should be self-selected; then we don't have to rely on the community at large.

There's this wonderful idea called "Do-ocracy" where everyone understands that the people actually willing to do things get all the say as to what gets done. This is where benevolent dictators like Linus Torvalds get their power.

Our democratic training has taught us to think this idea is a recipe for totalitarian disaster. The thing is, even if the democratic memplex were right in its injunction against authority, a country and an internet community are entirely different situations.

In a country, if you have king-power, you have military and legal power as well, and can physically coerce people to do what you want. There is enough money and power at stake that most of the people who want the job are in it for the money and power, not the public good. Thus measures like heritable power (at least you're not selecting for power-hunger) and democracy (now we're theoretically selecting for public support).

On the other hand, in a small artificial community like a meetup, a hackerspace, or LessWrong, there is no military to control, the banhammer is much less power than the noose or dungeon, and there is barely anything to gain by embezzling taxes (as a meetup organizer, I could embezzle about $30 a month...). At worst, a corrupt monarch could ban all the good people and destroy the community, but the incentive to do damage to the community is roughly "for the lulz". Lulz is much cheaper elsewhere. The amount of damage is highly limited by the fact that, in the absence of military power, the do-ocrat's power over people is derived from respect, which would rapidly fall off if they did dumb things. On the other hand, scope insensitivity makes the apparent do-gooder motivation just as high. So in a community like this, most of the people willing to do the job will be those motivated to do public good and those agenty enough to do it, so self-selection (do-ocracy) works and we don't need other measures.

Comment author: prase 12 December 2012 03:56:41PM *  5 points [-]

There's this wonderful idea called "Do-ocracy" where everyone understands that the people actually willing to do things get all the say as to what gets done. ... Our democratic training has taught us to think this idea is a recipe for totalitarian disaster.

I can't speak for your democratic training, but my democratic training has absolutely no problem with acknowledging merits and giving active people trust proportional to their achievements and letting them decide what more should be done.

It has become somewhat fashionable here, in the Moldbuggian vein, to blame community failures on democracy. But what particular democratic mechanisms have caused the lack of strategic decisions on LW? Which kind of decisions? I don't see much democracy here - I don't recall participating in an election, for example, or voting on a proposed policy, or seeing a heated political debate which prevented a beneficial resolution from being implemented. I recall the recent implementation of the karma penalty feature, which a lot of LWers were unhappy about but which was put in force nevertheless in a quite autocratic manner. So perhaps the lack of strategic decisions is caused by the fact that

  • there just aren't people willing to even propose what should be done
  • nobody has any reasonable idea what strategic decision should be made (it is one thing to say what kind of decisions should be made - e.g. "we should choose an efficient site design", but a rather different thing to make the decision in detail - e.g. "the front page should have a huge violet picture of a pony on it")
  • people aren't willing to work for free

None of those has much to do with democracy. I am pretty sure that if you volunteered to work on whichever of your suggestions (contacting meetup organisers, improving the site design...), nobody would seriously object and you would easily get some official status on LW (moderator style). To do anything from the examples you have mentioned you wouldn't need dictatorial powers.

Comment author: Nominull 09 December 2012 06:30:49PM *  3 points [-]

The power of the banhammer is roughly proportional to the power of the dungeon. If it seems less threatening, it's only because an online community is generally less important to people's lives than society at large.

A bad king can absolutely destroy an online community. Banning all the good people is actually one of the better things a bad king can do, because it can spark an organized exodus, which is just inconvenient. But by adding restrictions and terrorizing the community with the threat of bans, a bad king can make the good people self-deport. And then the community can't be revived elsewhere.

Comment author: [deleted] 09 December 2012 08:02:24PM 6 points [-]

At worst, a corrupt monarch could ... destroy the community, but the incentive to do damage to the community is roughly "for the lulz". Lulz is much cheaper elsewhere.

I admit, I have seen braindead moderators tear a community apart (/r/anarchism for one).

I have just as often seen lack of moderation prevent a community from becoming what it could. (4chan (though I'm unsure whether 4chan is glorious or a cesspool))

And I have seen strong moderation keep a community together.

The thing is, death by incompetent dictator is much more salient to our imaginations than death by slow entropy and September effects. Incompetent dictators have a face, which makes us take them much more seriously than an unbiased assessment of the threats would warrant.

Comment author: Vaniver 09 December 2012 09:27:46PM 2 points [-]

The power of the banhammer is roughly proportional to the power of the dungeon. If it seems less threatening, it's only because an online community is generally less important to people's lives than society at large.

There's a big difference between exile and prison, and the power of exile depends on the desirability of the place in question.

Comment author: [deleted] 09 December 2012 02:13:52PM 6 points [-]

LW needs a king.

Why “king” rather than “monarch”? Couldn't a queen do that?

Comment author: Luke_A_Somers 10 December 2012 01:55:49PM 5 points [-]

Maybe "Princess" would be best, considering everything.

Comment author: [deleted] 10 December 2012 02:08:42PM 1 point [-]

Hmmm... no, it definitely has to be a word that implies current authority, not future authority.

Comment author: Luke_A_Somers 10 December 2012 02:39:27PM *  2 points [-]

There is a particular princess in the local memespace with nigh-absolute current authority.

edited to clarify: by 'local memespace' I mean the part of the global memespace that is in use locally, not that there's something we have going that isn't known more broadly

Comment author: [deleted] 10 December 2012 02:50:15PM 1 point [-]

I am getting this "whoosh" feeling but I still can't see it.

Comment author: Luke_A_Somers 10 December 2012 05:03:04PM *  5 points [-]

If you image-search 'obey princess', you will get a hint. Note, the result is... an alicorn.

But more seriously (still not all that seriously), there would be colossal PR and communication disadvantages to naming a king, which would be mostly dodged by naming a princess.

In particular, people would probably overinterpret "king", but file "princess" under 'wacky'. This would not merely dodge that problem, but could help against the 'cold and calculating' vibe some people get.

Comment author: Kindly 10 December 2012 06:43:58PM 1 point [-]

Luke_A_Somers is referring to Princess Dumbledore, from Harry Potter and the Methods of Rationality, chapter 86.

Comment author: Luke_A_Somers 10 December 2012 09:39:19PM 0 points [-]

I'd love to read that chapter!

Comment author: Zack_M_Davis 10 December 2012 06:06:08PM *  1 point [-]

(Almost certainly a reference to the animated series My Little Pony: Friendship Is Magic, in which Princess Celestia rules the land of Equestria.)

Comment author: [deleted] 10 December 2012 02:43:12PM 1 point [-]

Let's just say BDFL (Benevolent Dictator For Life)...

Comment author: Luke_A_Somers 10 December 2012 06:30:00PM 1 point [-]

Insufficiently wacky - would invite accusations of authoritarianism/absolutism from the clue-impaired.

Comment author: faul_sname 09 December 2012 11:12:17PM 13 points [-]

Yes, and a queen could move more than one space in a turn, too.

Comment author: J_Taylor 09 December 2012 05:34:53PM 8 points [-]

For obvious decision-theoretic reasons, a king is necessary. However, the king does not have to be a man.

Comment author: [deleted] 09 December 2012 05:45:30PM 3 points [-]

"CEO" could work. I just like the word "king". a queen would do just as well.

Comment author: pleeppleep 09 December 2012 02:19:33PM 2 points [-]

Now you're just talking crazy.

Comment author: DanArmak 09 December 2012 05:54:45PM 1 point [-]

The queen's duty is to secure the royal succession!

Comment author: Epiphany 09 December 2012 11:54:25PM *  -2 points [-]

I don't think a CEO-level monarch is necessary, though I don't know what job title a community "gardener" would map to. Do you think a female web developer who obviously cares a lot about LW and can implement solutions would be a good choice?

This doesn't look like it's very likely to happen though, considering that they're changing focus:

For 12 years we've largely focused on movement-building through the Singularity Summit, Less Wrong, and other programs... But in 2013 we plan to pivot so that a much larger share of the funds we raise is spent on research.

Then again maybe CFAR will want to do something.

Comment author: Curiouskid 19 December 2012 03:22:19AM *  1 point [-]

I think you meant to use a different hyperlink?

Comment author: Epiphany 19 December 2012 04:12:56AM 1 point [-]

It has been fixed. Thanks, Curiouskid!

Comment author: [deleted] 10 December 2012 12:14:33AM *  0 points [-]

female web developer who obviously cares a lot about LW and can implement solutions would be a good choice?

Female doesn't matter, web development is good for being able to actually write what needs to be written. Caring is really good. The most important factor though is willingness to be audacious, grab power, and make things happen for the better.

Whether or not we need someone with CEO-power is uninteresting. I think such a person having more power is good.

If you're talking about yourself, go for it. Get a foot in the code, make the front page better, be audacious. Make this place awesome.

I've said before in the generic, but in this case we can be specific: If you declare yourself king, I'll kneel.

(good luck)

Comment author: Alicorn 10 December 2012 01:01:27AM *  10 points [-]

If you're talking about yourself, go for it. Get a foot in the code, make the front page better, be audacious. Make this place awesome.

I'm opposed to appointing her as any sort of actual-power-having-person. Epiphany is a relative newcomer who makes a lot of missteps.

Comment author: [deleted] 10 December 2012 05:39:13AM *  2 points [-]

I agree that appointing her would be a bad idea.

I see no problem with encouraging people (in this case, her) to become the kind of person we should appoint.

Comment author: wedrifid 10 December 2012 01:21:47AM 1 point [-]

I'm opposed to appointing her as any sort of actual-power-having-person.

The personal antipathy there has been distinctly evident to any onlookers who are mildly curious about how status and power tend to influence human behavior and thought.

Comment author: Alicorn 10 December 2012 06:41:17AM *  2 points [-]

I think anyone with any noticeable antipathy between them and any regular user should not have unilateral policymaking power, except Eliezer if applicable because he was here first. (This rules me out too. I have mod power, but not mod initiative - I cannot make policy.)

Comment author: wedrifid 10 December 2012 09:58:47AM 0 points [-]

I think anyone with any noticeable antipathy between them and any regular user should not have unilateral policymaking power, except Eliezer if applicable because he was here first. (This rules me out too. I have mod power, but not mod initiative - I cannot make policy.)

I agree and note that it is even more important that people with personal conflicts don't have the power (or, preferably, voluntarily waive the power) to actively take specific actions against their personal enemies.

(Mind you, the parent also seems somewhat out of place in the context and very nearly comical given the actual history of power abuses on this site.)

Comment author: Epiphany 10 December 2012 12:27:59AM 0 points [-]

Female doesn't matter, web development is good for being able to actually write what needs to be written. Caring is really good. The most important factor though is willingness to be audacious, grab power, and make things happen for the better.

Well I do have the audacity.

If you're talking about yourself, go for it. Get a foot in the code, make the front page better, be audacious. Make this place awesome.

I would love to do that, but I've just gotten a volunteer offer for a much larger project I had an idea for. I had been hoping to do a few smaller projects on LW in the meantime, while I was putting some things together to launch my larger projects, and the timing seems to have worked out such that I will be doing the small projects while doing the big projects. In other words, my free time is projected to become super scarce.

However, if a job offer were presented to me from LessWrong / CFAR I would seriously consider it.

If you declare yourself king, I'll kneel.

I don't believe in this. I am with Eliezer on sentiments like the following:

In Two More Things to Unlearn from School he warns his readers that "It may be dangerous to present people with a giant mass of authoritative knowledge, especially if it is actually true. It may damage their skepticism."

In Cached Thoughts he tells you to question what HE says. "Now that you've read this blog post, the next time you hear someone unhesitatingly repeating a meme you think is silly or false, you'll think, "Cached thoughts." My belief is now there in your mind, waiting to complete the pattern. But is it true? Don't let your mind complete the pattern! Think!"

But thank you. (:

Comment author: [deleted] 10 December 2012 12:54:29AM 3 points [-]

I would love to do that, but I've just gotten a volunteer offer for a much larger project I had an idea for. I had been hoping to do a few smaller projects on LW in the meantime, while I was putting some things together to launch my larger projects, and the timing seems to have worked out such that I will be doing the small projects while doing the big projects. In other words, my free time is projected to become super scarce.

grumble grumble. Like I said, everyone who could is doing something else. Me too.

However, if a job offer were presented to me from LessWrong / CFAR I would seriously consider it.

I don't think they'll take the initiative on this. Maybe you approach them?

I don't believe in this. I am with Eliezer on sentiments like the following:

<the following>

I don't see how those relate.

But thank you.

Thank you for giving a shit about LW, and trying to do something good. I see that you're actively engaging in the discussions in this thread and that's good. So thanks.

Comment author: Epiphany 11 December 2012 02:15:22AM *  2 points [-]

grumble grumble. Like I said, everyone who could is doing something else. Me too.

Yeah. Well maybe a few of us will throw a few things at it and that'll keep it going...

I don't think they'll take the initiative on this. Maybe you approach them?

I mentioned a couple times that I'm dying to have online rationality training materials and that I want them badly enough I am half ready to run off and make them myself. I said something like "I'd consider doing this for free or giving you a good deal on freelance depending on project size". Nobody responded.

I don't see how those relate.

Simply put: I'm not the type that wants obedience. I'm the type that wants people to think for themselves.

Thank you for giving a shit about LW, and trying to do something good. I see that you're actively engaging in the discussions in this thread and that's good. So thanks.

Aww. I think that's the first time I've felt appreciated for addressing endless September. (: feels warm and fuzzy

Comment author: [deleted] 12 December 2012 04:17:07AM 0 points [-]

Simply put: I'm not the type that wants obedience. I'm the type that wants people to think for themselves.

Please allow me to change your mind. I am not the type who likes obedience either. I agree that thinking for ourselves is good, and that we should encourage as much of it as possible. However, this does not negate the usefulness of authority:

Argument 1:

Life is big. Bigger than the human mind can reasonably handle. I only have so much attention to distribute around. Say I'm a meetup participant. I could devote some attention to monitoring LW, the mailing list, etc. until a meetup is posted, then overcome the activation energy to actually go. Or, the meetup organizer could mail me and say "Hi Nyan, come to Xday's meetup", and then I just have to go. I don't have to spend as much attention in the second case, so I have more to spend on thinking-for-myself that matters, like figuring out whether the mainstream assumptions about glass are correct.

So in that way, having someone to tell me what to think and do reduces the effort I have to spend on those things, and makes me more effective at the stuff I really care about. So I actually prefer it.

Argument 2:

Even if I had infinite capacity for thinking for myself and going my own way, sometimes it just isn't the right tool for the job. Thinking for myself doesn't let me coordinate with other people, or fit into larger projects, or affect how LW works, or many other things. If I instead listen to some central coordinator, those things become easy.

So even if I'm a big fan of self-sufficiency and skepticism, I appreciate authority where available. Does this make sense?

Replies to downvoted comments blah blah blah

Perhaps we should continue this conversation somewhere more private... /sleaze

PM me if you want to continue this thread.

Comment author: Epiphany 12 December 2012 07:15:50AM *  1 point [-]

Please allow me to change your mind. I am not the type who likes obedience either.

Well that is interesting and unexpected.

Argument 1:

This seems to be more of a matter of notification strategies - one where you have to check a "calendar" and one where the "calendar" comes to you. I am pattern-matching the concept "reminder" here. It seems to me that reminders, although important and possibly completely necessary for running a functional group, would be more along the lines of a behavioral detail as opposed to a fundamental leadership quality. I don't know why you're likening this to obedience.

Even if I had infinite capacity for thinking for myself

We do not have infinite capacity for critical thinking. True. I don't call trusting other people's opinions obedience. I call it trust. That is rare for me. Very rare for anything important. Next door to trust is what I do when I'm short on time or don't have the energy: I half-ass it. I grab someone's opinion, go "Meh, 70% chance they're right?" and slap it in.

I don't call that obedience, either.

I call it being overwhelmingly busy.

Thinking for myself doesn't let me coordinate with other people, or fit into larger projects, or affect how LW works, or many other things. If I instead listen to some central coordinator, those things become easy.

Organizing trivial details is something I call organizing. I don't call it obedience.

When I think of obedience I think of that damned nuisance demand that punishes me for being right. This is not because I am constantly right - I'm wrong often enough. I have observed, though, that some people are more interested in power than in wielding it meaningfully. They don't listen, and they use power as a way to avoid updating (leading them to be wrong frequently). They demand this thing "obedience", and that seems to be a warning that they are about to act as if might makes right.

My idea of leadership looks like this:

  • If you want something new to happen, do it first. When everyone else sees that you haven't been reduced to a pile of human rubble by the new experience, they'll decide the "guinea pig" has tested it well enough that they're willing to try it, too.

  • If you really want something to get done, do it your damn self. Don't wait around for someone else to do it, nag others, etc.

  • If you want others to behave, behave well first. After you have shown a good intent toward them, invite them to behave well, too. Respect them and they will usually respect you.

  • If there's a difficulty, figure out how to solve it.

  • Give people something they want repeatedly and they come back for it.

  • If people are grateful for your work, they reciprocate by volunteering to help or donating to keep it going.

To me, that's the correct way of going about it. Using force (which I associate with obedience) or expecting people not to have thoughts of their own is not only completely unnecessary but pales in comparison effectiveness-wise.

Maybe my ideas about obedience are completely orthogonal to yours. If you still think obedience has some value I am unaware of, I'm curious about it.

if you want to continue...

Thank you for your interest. It feels good.

I have a romantic interest right now; although we have not officially deemed our status a "relationship", we are considering one another as potential serious partners.

This came to both of us as a surprise. I had burned out on dating and deleted my dating profile. I was like:

insane amount of dating alienation * ice cube's chance of finding compatible partner > benefits of romance

(Narratives by LW Women thread if you want more)

And so now we're like ... wow this amount of compatibility is special. We should not waste the momentum by getting distracted by other people. So we decided that in order to let the opportunity unfold naturally, we would avoid pursuing other serious romantic interests for now.

So although I am technically available, my expected behavior, considering how busy I am, would probably be best classified as "dance card full".

Comment author: [deleted] 12 December 2012 01:43:44PM *  1 point [-]

We seem to have different connotations on "obedience", and might be talking about slightly different concepts. Your observations about how most people use power, and about the bad kind of obedience, are spot-on.

The topic came up because of the "I'd kneel to anyone who declared themselves king" thing. I don't think such a behaviour pattern has to lead to the bad, power-abusing kind of obedience and submission. I think it's just a really strategically useful thing to support someone who is going to act as the group-agency. You seem to agree on the important stuff and we're just using different words. Case closed?

romantic.

lol what? Either you or I have utterly misunderstood something, because I'm utterly confused. I made a mock-sleazy joke about the goddam troll toll, and suggested that we wouldn't have to pay it but could still discuss if we PMed instead. And then suddenly this romantic thing. OhgodwhathaveIdone.

It feels good.

That's good. :)

Comment author: Epiphany 12 December 2012 06:14:04PM *  1 point [-]

You seem to agree on the important stuff and we're just using different words. Case closed?

Yeah, I think the main difference may be that I am very wary of power abuse, so I avoid using terms like "obedience" and "kneeling" and "king" and choose other terms that imply a situation where power is balanced.

lol what? Either you or I have utterly misunderstood something

Sorry, I think I must have misread that. I've been having problems sleeping lately. If you want to talk in PM to avoid the troll toll go ahead.

That's good. :)

Well not anymore. laughs at self

Comment author: Kindly 09 December 2012 03:51:43PM 8 points [-]

"LessWrong has lost 52% of it's giftedness since March of 2009" is an incredibly sensationalist way of describing a mere 7-point average IQ drop. Especially if the average is dropping due to new users, because then the "giftedness" isn't actually being lost.

Comment author: Nominull 09 December 2012 06:24:15PM 5 points [-]

Well, I agree, but "mere" probably isn't a sensationalist enough way to describe a 7 point drop in IQ.

Comment author: Kindly 09 December 2012 07:49:56PM 2 points [-]

Okay, I agree, maybe that was pushing it a little.

Comment author: [deleted] 13 December 2012 01:42:07PM 1 point [-]

Honestly, so long as the drop is due to lower-IQ people arriving rather than higher-IQ people leaving, I can't see why it's such a big deal -- especially if the “new” people mostly just lurk. Now, if the average IQ of only the people with > 100 karma in the last 30 days was also dropping with time...

Comment author: gwern 09 December 2012 07:00:11PM 4 points [-]

Some absolute figures:

R> # load the three survey datasets; IQ is read in as a factor, hence
R> # the as.integer(as.character(...)) conversions below
R> lw2009 <- read.csv("2009.csv"); lw2011 <- read.csv("2011.csv"); lw2012 <- read.csv("2012.csv")
R>
R> # respondents self-reporting an IQ above 140, by survey year
R> sum(as.integer(as.character(lw2009$IQ)) > 140, na.rm=TRUE)
[1] 31
R> sum(as.integer(as.character(lw2011$IQ)) > 140, na.rm=TRUE)
[1] 131
R> sum(as.integer(as.character(lw2012$IQ)) > 140, na.rm=TRUE)
[1] 120
R>
R> # and above 150
R> sum(as.integer(as.character(lw2009$IQ)) > 150, na.rm=TRUE)
[1] 20
R> sum(as.integer(as.character(lw2011$IQ)) > 150, na.rm=TRUE)
[1] 53
R> sum(as.integer(as.character(lw2012$IQ)) > 150, na.rm=TRUE)
[1] 42
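
(A possible follow-up in the same style, normalizing the counts by each year's number of IQ responses; output omitted here since it is not part of the original transcript:)

R> # share of a year's IQ responses that exceed 140, e.g. for 2012
R> mean(as.integer(as.character(lw2012$IQ)) > 140, na.rm=TRUE)
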
Comment author: RichardKennaway 10 December 2012 02:45:46PM 4 points [-]

So, what's needed is a division into an introductory place for anyone to join and learn, and a "graduate-level" place for people with serious ability and commitment to making stuff happen. The latter wouldn't be a public forum, in fact it wouldn't even be a forum at all, even if as part of its activities it has one. It would be an organisation founded for the purpose of promulgating rationality and improving that of its members, not merely talking about it. It would present itself to the world as such, with participation by invitation or by application rather than just by signing in on a web site.

In other words, LessWrong and CFAR.

Comment author: Vaniver 09 December 2012 09:32:01PM 2 points [-]

I think that LW would be better with more good content. (Shocking!) I like plans that improve the amount of visible good content on LW, and am generally ambivalent towards plans that don't have that as an explicit goal.

My preferred explanation for why LW is less fun than it was before: Curiosity seeks to annihilate itself, and the low-hanging fruit has been picked. At one point, I was curious about what diet I should follow; I discovered intermittent fasting, tried it out, and it worked well for me. I am now far less curious about what diet I should follow. Similarly, LW may have gone through its rush of epiphanies, and now there is only slow, steady progress. The social value of LW (which is where Eternal September has its primary effects) is also moving towards meetups, which is probably better at fulfilling the social needs of meetup members but fractures the community and drains activity from the site.

Are there good topics out there that people haven't written posts on? I think so, and there are a few subjects that I know about and that I'm writing posts / sequences about. But they are little acorns, not big oaks. I believe that's necessary, but it will not look the same to readers.

Comment author: [deleted] 10 December 2012 03:11:11PM 3 points [-]

Good points, but a bucket of picked fruit does not make a pie.

We've generated a lot of really valuable insight on this site, but right now it has no structure to it.

Maybe it's time to move from an article-writing phase to a knowledge-organization phase.

Comment author: Epiphany 11 December 2012 02:53:57AM 1 point [-]

I had been thinking that, too. Some people had mentioned argument-mapping software; however, I have heard some really harsh criticisms of those tools. Not sure if that's the right way.

Maybe a karma-infused wiki (alluding to Luke's recent post).

Comment author: [deleted] 12 December 2012 04:20:41AM 1 point [-]

A textbook style overview/survey of LW rationality would be pretty awesome.

Comment author: David_Gerard 10 December 2012 12:25:16AM 6 points [-]

As was noted previously: the community is probably doomed. I suspect all communities are - they have a life cycle.

Even if a community has the same content, the people within it change with time.

The essential work on the subject is Clay Shirky's A Group Is Its Worst Enemy. It says at some point it's time for a wizard smackdown, but it's not clear this helps - it also goes from "let's see what happens" development to a retconned fundamentalism, where the understood implicit constitution is enforced. This can lead to problems if people have different ideas on what the understood implicit constitution actually was.

I also think this stuff is constant because Mark Dery's Flame Wars discussed the social structure of online groups (Usenet, BBSes) in detail in 1994, and described the Internet pretty much as it is now and has been since the 1980s.

tl;dr people are a problem.

Comment author: Larks 10 December 2012 11:08:50PM 3 points [-]

Without some measure of who the respondents are, this survey can't mean much. If the recent arrivals vote en masse that there is no problem, the poll will suggest there isn't any, even though Eternal September is the very mechanism that causes the poll outcome! For the same reason that sufficiently large immigration becomes politically impossible to reverse, so too Eternal September cannot be combatted democratically.

To get a more accurate response, we'd have to restrict it to people who had more than 100 karma 12 months ago or something.
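
(A hypothetical sketch in R of that restriction, reusing the survey data frames gwern loads above; the KarmaScore column name is an assumption, and since the surveys don't record karma from 12 months earlier, current self-reported karma stands in for it:)

# keep only respondents self-reporting more than 100 karma
veterans <- subset(lw2012, as.numeric(as.character(KarmaScore)) > 100)
# then re-tabulate the poll answers within `veterans` only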

Comment author: SoftFlare 09 December 2012 11:40:32AM 3 points [-]

We might want to consider methods of raising standards for community members via barriers to entry employed elsewhere (either for posting, for getting at some or all of the content, or even for hearing about the site's existence):

  • An application process for entry (Workplaces (e.g. Valve), MUD sites)
  • Regulating influx using a member cap (Torrent sites, betas of web products)
  • An activity standard - You have to be at least this active to maintain membership (Torrent sites, task groups in organizations sometimes)
  • A membership fee - Maybe in conjunction with an activity standard - (Torrent sites, private online communities, private real-world communities, etc.)
  • Allowing membership only by invitation/sponsorship - (Torrent sites, US Citizenship, Law firms partnership, The Bavarian Illuminati)
  • Having a section of the site be a secret, only to be revealed to people who have proven themselves, à la Bayesian conspiracy - (How classified intelligence organizations work sometimes (not a joke), Internet forums)
  • Karma-based feature upgrading (Stack Exchange, Internet Forums)

Or any combination of the above applied to different sections. If anyone would like to pursue this, I am willing to spend up to 2 hours a week for the next few weeks constructing a solid plan around this, given someone else is willing to commit at least similar resources.

On a different note, and as anecdotal evidence, I have been lurking on LW for years now, and went to a CFAR camp before posting a single comment - in fear of karma retribution and trolling. (I know that it's a bad strategy and that I shouldn't care as much. Sadly, I'm not as good at self-modification as I would like to be sometimes.)

Comment author: Eugine_Nier 09 December 2012 06:54:48PM 5 points [-]

On a different note, and as anecdotal evidence, I have been lurking on LW for years now, and went to a CFAR camp before posting a single comment - in fear of karma retribution and trolling. (I know that it's a bad strategy and that I shouldn't care as much. Sadly, I'm not as good at self-modification as I would like to be sometimes.)

On the other hand, lurking for a while before posting is very much what we want new users to do.

Comment author: Oscar_Cunningham 09 December 2012 10:11:42AM 1 point [-]

I think we just need people to downvote more. Perhaps we could insist that you downvote one thing for every three things that you upvote?

Comment author: Decius 09 December 2012 09:45:51PM 7 points [-]

Weight each member's upvotes in a manner determined by the proportion of their votes in each direction and their total karma.

Comment author: Luke_A_Somers 10 December 2012 01:53:25PM 1 point [-]

This takes output as input. Would you go with a self-consistent result, make it time-inconsistent, or cut it off at one tier?

A better solution, I think, would be to weight the karma change by the 'information' that this new vote would provide if the order of votes were irrelevant - i.e. multiply it by min(1, -log2(P)), with P being the fraction of the voter's votes that are of that vote type. So if Bob likes Alice's post when Bob likes everyone's posts, Alice doesn't get much from it. If Bob likes Alice's post when Bob likes half or fewer of all posts he votes on, Alice gets 1 full karma from it.
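
(A minimal sketch of that rule in R, matching the R used elsewhere in the thread; the function name and inputs are hypothetical, and the cap and the sign on the log follow the examples above:)

# n_same: how many of the voter's past votes went in this direction
# n_total: the voter's total number of past votes
vote_weight <- function(n_same, n_total) {
  p <- n_same / n_total   # P, the fraction of the voter's votes of this type
  min(1, -log2(p))        # bits of 'surprise', capped at one full karma point
}
vote_weight(100, 100)   # upvotes everything: weight 0
vote_weight(50, 100)    # upvotes half the time: weight 1
vote_weight(10, 100)    # very selective: still capped at 1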

Comment author: Decius 10 December 2012 05:01:11PM 0 points [-]

Time-inconsistent. Nothing you do after you upvote should change the results. This risks having karma determined almost exclusively by how many posts are upvoted by the top elite.

Perhaps a better solution could be found if we could establish all of the goals of the karma system. Why do we track users' total karma?

Comment author: RobertLumley 09 December 2012 12:40:14AM 1 point [-]

It's almost like as we grow in number we regress towards the mean. Shocking.

Comment author: jmmcd 09 December 2012 02:05:51AM 4 points [-]

If trending towards the mean wasn't explicitly mentioned in the poll this would be a useful contribution. As it stands, you should pay a lol toll.

Comment author: [deleted] 09 December 2012 04:43:45AM 1 point [-]

Option 1: Close the borders. It's unfortunate that the best sort might be kept out, while it's guaranteed the rest will be kept out. The best can found / join other sites, and LW can establish immigration policies after a while.

Option 2: Builds. Freeze LW at a stage of development, then have a new build later. Call this one LW 2012, and nobody can join for six months, and we're intent on topics X, Y and Z. Then for build 2013 there are some vacancies (based on karma?) for a period of time, and we're intent on topics X, Q and R.

Option 3: Expiration date. No matter how good or bad it gets, on date N it closes shop. Then it is forked into the This-ists and the Those-ians with a few Whatever-ites that all say they carry the torch and everybody else is more wrong.

Comment author: Nornagest 09 December 2012 05:10:06AM *  8 points [-]

In my experience, which admittedly comes from sites quite different from LW, an Internet project running on volunteer contributions that's decided to keep new members from productive roles has a useful lifetime of no more than one to two years. That's about how long it takes for everybody to voice their pet issues, settle their feuds, and move on with their lives; people may linger for years afterwards, but at that point the vital phase is over. This can be stretched somewhat if there's deep factional divisions within the founding population -- spite is a pretty good motivator -- but only at the cost of making the user experience a lot more political.

It's also worth bearing in mind that demand for membership in a forum like this one is continuous and fairly short-term. Accounts offer few easily quantifiable benefits to begin with, very few if the site's readable to non-members, so we can't rely on lasting ambitions to participate; people sign up because they view this forum as an attractive place to contribute, but the Internet offers no shortage of equivalent niches.

Comment author: NancyLebovitz 10 December 2012 02:04:45PM 2 points [-]

Added for completeness (I'm not sure immigration restrictions are a good idea): Have an invitation system.

Comment author: dbaupp 10 December 2012 12:47:13AM 2 points [-]

Option 1: Close the borders. It's unfortunate that the best sort might be kept out, while it's guaranteed the rest will be kept out. The best can found / join other sites, and LW can establish immigration policies after a while.

This isn't so ridiculous in short bursts. I know that Hacker News disables registration if/when they get large media attention, to avoid a swathe of new only-mildly-interested users. A similar thing could happen here. (It might be enough to have an admin switch that just puts a display: none into the CSS for the "register" button; trivial inconveniences and all.)

Comment author: DanArmak 09 December 2012 05:56:42PM *  1 point [-]

LW can establish immigration policies after a while.

Since this started out with people complaining about others already here, we might be called upon to create an immigration police. You sir, show me your +500 karma badge!

Comment author: [deleted] 12 February 2013 01:02:59PM *  1 point [-]

I really don't see why Epiphany is so obsessed with IQ. Based on anecdotal evidence, there is not much of a correlation between IQ and intellect beyond the first two standard deviations above the mean anyway. I have come across more than a handful of people who don't excel in traditional IQ tests, but who are nevertheless very capable of presenting coherent, well-argued insights. Does it matter to me that their IQ is 132 instead of 139? No. Who cares about the average IQ among members of the LW community as long as we continue demonstrating the ability to engage in thoughtful discussions and generate valuable conclusions?

It is also possible to inflate your IQ score by taking tests repeatedly. "One meta-analysis reports that a person who scores in the 50th percentile on their first test will be at the 80th by their third", according to this page: http://rationalwiki.org/wiki/High_IQ_society. If you are vain and think that doing well on an IQ test is a really important way of signalling intellect, then go ahead and keep doing exercises in Mensa practice books, though that would not make you more capable of critical thinking or logical argumentation.

Comment author: Epiphany 13 February 2013 02:35:07AM *  0 points [-]

I have come across more than a handful of people who don't excel in traditional IQ tests, but who are nevertheless very capable of presenting coherent, well-argued insights. Does it matter to me that their IQ is 132 instead of 139? No. Who cares about the average IQ among members of the LW community as long as we continue demonstrating the ability to engage in thoughtful discussions and generate valuable conclusions?

Another possibility here is that your perceptions of intelligence levels are really off. This isn't too unlikely as I see it:

I've heard reports that people with super high IQs have trouble making distinctions between normal and bright, or even between moderately gifted and mentally challenged. I frequently observe that the gifted people I've met experience their own intelligence level as normal, and accidentally mistake normal people for stupid ones, or mistakenly interpret malice when only ignorance is present (because they're assuming the other person is as smart as they are and would therefore never make such an ignorant mistake).

If the intelligence difference you experience every day is 70 points wide, your perceptions are probably more geared to find some way to make sense of conflicting information, not geared to be sensitive to ten point differences.

As a person who has spent a lot of time learning about intelligence differences, I'd say it's fairly hard to perceive intelligence differences smaller than 15 points anyway. The 30 point differences are fairly easy to spot. A large part of this may be because of the wide gaps in abilities that gifted people tend to have between their different areas of intelligence. So, you've got to figure that IQ 130 might be an average of four abilities that are quite different from each other, and so the person's abilities will likely overlap with some of the abilities of a person with IQ 120 or IQ 140. However, a person with an IQ of 160 will most likely have their abilities spread out across a higher up range of ability levels, so they're more likely to seem to have completely different abilities from people who have IQs around 130.

The reason why a few points of difference is important in this context is because the loss appears to be continuing. If we lose a few points each year, then over time, LessWrong would trend toward the mean and the culture here may die as a result.

Comment author: [deleted] 13 February 2013 01:21:10PM 1 point [-]

or mistakenly interpret malice when only ignorance is present (because they're assuming the other person is as smart as they are and would therefore never make such an ignorant mistake)

I'm under the impression that a substantial part of Hanson's Homo hypocritus observations fall prey to this failure mode.

Comment author: Epiphany 13 February 2013 06:15:10PM 1 point [-]

Is there a name for this failure mode? For clarity: The one where people use themselves as a map of other people and are frequently incorrect. That would be good to have.

Comment author: Vladimir_Nesov 13 February 2013 06:16:32PM 3 points [-]
Comment author: [deleted] 13 February 2013 09:34:47AM *  0 points [-]

Sorry about my tardiness when responding to comments. I don't visit LessWrong very often. Maybe in future I should refrain from posting comments unless I am sure that I have the time and diligence to participate satisfactorily in any discussion that my comments might generate, since I wouldn't want to come across as rude.

Another possibility here is that your perceptions of intelligence levels are really off.

After reading and thinking a bit about this comment, I think you might be right, especially regarding the point that gifted people might often

mistakenly interpret malice when only ignorance is present.

I am rather bad at reading other people. I am not diagnosed with any degree of autism, but I am rather socially stunted nevertheless. As I mentioned in an earlier comment, I can be socially inept. This self-assessment was the conclusion of many instances where I was informed that I had grossly misunderstood certain social situations or inadvertently committed some kind of faux pas.

It is also generally difficult for me to gauge whether specific comments of mine might be construed as passive-aggressive/condescending. When you asked if my intention was to insult you, my response was "No, but I am sorry that you feel that way". In the past, when I did not know any better, I would have said, "No, and don't be so sensitive." As you can imagine, that response usually escalated things instead of calming people down. It is a long and ongoing learning process for me to understand how to react appropriately in social contexts in order to avoid hurt feelings.

In short, it seems like I commit the mind projection fallacy a lot when interacting with other people: If I wouldn't feel offended by certain ways of phrasing things, I assume that other people wouldn't either. If I wouldn't make such an ignorant mistake, I assume that other people wouldn't either.

The reason why a few points of difference is important in this context is because the loss appears to be continuing.

When you put it like this, I can understand your concern.

Comment author: [deleted] 13 February 2013 01:22:38PM 0 points [-]

The reason why a few points of difference is important in this context is because the loss appears to be continuing. If we lose a few points each year, then over time, LessWrong would trend toward the mean and the culture here may die as a result.

http://xkcd.com/605/ http://xkcd.com/1007/

(SCNR.)

Comment author: Epiphany 13 February 2013 06:19:38PM *  -1 points [-]

Ok, FYI, if you see the words "appears to be" and "if" in my sentences, it means I am acknowledging the ambiguity. If you do not want to annoy me, please wait until I'm using words like "definitely" and "when" or direct your "could not resist" comments at someone else.

If you want to discuss how we may determine the probability of a consistent and continuing downward trend, that would be constructive and I'd be very interested. Please do not waste my time by pointing out the obvious.

Comment author: [deleted] 13 February 2013 07:26:38PM *  0 points [-]

If you want to discuss how we may determine the probability of a consistent and continuing downward trend, that would be constructive and I'd be very interested. Please do not waste my time by pointing out the obvious.

(First of all, as I might have already mentioned, I don't think that the average of (IQ - 132) over all readers is a terribly interesting metric; the total number of active contributors with IQ above 132 or something like that might be better.)

I'd guess that the decline in average IQ is mostly due to lower-IQ people arriving rather than to higher-IQ people leaving (EDIT: applying the intraocular trauma test to this graph appears to confirm that), and the population growth appears to have tapered off (there were fewer respondents in the 2012 survey than in the 2011 one, even though the 2011 one was open for longer). I'd guess the average IQ of readers is decreasing with time as a reversed logistic function, but we'd have to fit a four-parameter curve to three data points to test that.
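
(For concreteness, a sketch of one four-parameter "reversed logistic" family in R; all parameter names and example values below are made up purely for illustration:)

# hi/lo: starting and limiting average IQ; t0: midpoint of the decline;
# k: steepness. Decreasing in t, hence "reversed".
rev_logistic <- function(t, hi, lo, t0, k) {
  lo + (hi - lo) / (1 + exp(k * (t - t0)))
}
rev_logistic(2009:2012, hi=146, lo=130, t0=2011, k=1)  # a declining S-curve
# with only three survey points, the four parameters are underdetermined,
# which is the point made above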

Comment author: Epiphany 14 February 2013 03:57:06AM *  0 points [-]

the total number of active contributors with IQ above 132 or something like that might be better

Actually, a similar concern was brought up in response to my IQ Accuracy comment, and Vaniver discovered that the average IQs of the active members and the lurkers were almost exactly the same:

165 out of 549 responses without reported positive karma (30%) self-reported an IQ score; the average response was 138.44.

181 out of 518 responses with reported positive karma (34%) self-reported an IQ score; the average response was 138.25.

We could separate the lurkers from the active members and do the analysis again, but I'm not sure it would be worth the effort as it looks to me like active members and lurkers are giving similar answers. If you'd like to do that, I'd certainly be interested in any surprises you uncover, but I don't expect it to be worthwhile enough to do it myself.

I'd guess that the decline in average IQ is mostly due to lower-IQ people arriving rather than to higher-IQ people leaving (EDIT: applying the intraocular trauma test to this graph appears to confirm that)

The sample for the highest IQ groups is, of course, rather small, but what's been happening with those groups is not encouraging. The specific graph in question (and I very much doubt that Gwern intended to make it misleading in any way) is simply not designed to illustrate that particular aspect of the results clearly.

Here are a few things you wouldn't guess without looking at the numbers:

Exceptionally gifted people used to be 18% of the IQ respondents. Now they are 6%.

The total number of highly and exceptionally gifted respondents decreased in 2012, while normal and moderately gifted respondents increased.

I did some analysis here

Comment author: Epiphany 12 February 2013 08:09:49PM *  0 points [-]

I really don't see why Epiphany is so obsessed with IQ. Based on anecdotal evidence, there is not much of a correlation between IQ and intellect beyond the first two standard deviations above the mean anyway.

Try reading this response to Slade's suicidal post and you will begin to understand why giftedness is relevant, in a general sense. Gifted people, especially highly gifted people, are very different from most. If you haven't seen that for yourself, then perhaps:

A. You haven't met someone with an IQ like 160 or 180. Those people tend to be very, very different, so maybe you are only comparing people with much smaller IQ differences with each other.

B. The people you've met with super high IQs behave in a way that blends in when they're with you and minimize social contact so that you don't notice the differences. The ones that I know tend to do that. They don't just barge into a room and solve unsolvable science problems for all to see. They tend to be quiet, or away hiding in their caves.

C. You never asked the IQs of the smartest people you know and therefore haven't seen the difference.

D. You feel strongly that we should express egalitarianism by treating everyone as if they are all intellectually exactly the same. There's a movement of people who want to believe that everyone is gifted, that giftedness does not exist, that it goes away, or that gifted people have some horrible flaw that "balances" them out, and that they should be stifled in schooling environments in order to destroy their giftedness so that they're intellectually equal to everybody else, among other things. Many people hate inequality and cannot accept the well-documented fact that intellectual inequalities do exist. Wanting to solve inequalities is great, but it's important that we don't deny that intellectual inequalities exist, and it's absolutely, undeniably wrong to stifle a person, especially a child, in the name of "equality". I care a lot about this cause. I hope you read this PDF by developmental psychologist Linda Silverman (I want everyone to read it):

Myths about the Gifted

I have come across more than a handful of people who don't excel in traditional IQ tests, but who are nevertheless very capable of presenting coherent, well-argued insights.

One in six gifted people has a learning disorder. About one in three are creative. Some of them have mental disorders or physical conditions. All three of these can reduce one's IQ score and should be compensated for on an IQ test. Unfortunately, a lot of the IQ tests that are administered (by Mensa for instance) do not include any sort of evaluation for multiple exceptionalities (jargon for when you've got multiple differences that affect learning).

Who cares about the average IQ among members of the LW community as long as we continue demonstrating the ability to engage in thoughtful discussions and generate valuable conclusions?

You missed my point. My point was: "LessWrong may be headed toward cultural collapse so we need some way to determine whether this is a real threat. Do we have numbers? Yes we do. We have IQ numbers." The IQ blurb was a data point for an ongoing discussion on the controversial yet critical topic of whether LessWrong's subculture is dying. My point was not "Oh no, we cannot lose IQ points!"

Let me ask you this: if you were attempting to determine whether LessWrong is headed for cultural collapse, and you knew that the average IQ at LessWrong was decreasing, and you knew that you needed to supply the group with all related data, could you justify omitting it? You would have to include it to be thorough, as it is related. That point is at the top because it's new; most of the other points have been presented before. I couldn't present the IQ data until it had been thoroughly analyzed.

I'm a psychology enthusiast with a special interest in developmental psychology, specifically in gifted adults. When I go to the trouble of thoroughly analyzing some data and sharing information that I gathered while pursuing a main interest of mine, I very much prefer respectful comments in return such as "I don't see the relevance of IQ in this context, would you mind explaining?" as opposed to being called "obsessed". I prefer it even more if the person double checks their own perceptions to clear up any confusion on their own before responding to me.

I have a passion for learning which is not pathological. The term "obsessed" is inappropriate and offensive. Try this: Gwern, one of LessWrong's most prominent and most appreciated members, also has a passion for learning. Check out his website. If you do not appreciate the thoroughness with which he pursues truth - a major element of LessWrong culture - then perhaps it's time to consider whether this is a compatible hangout spot.

If you are vain and think that doing well on an IQ test is a really important way of signalling intellect, then go ahead and keep doing exercises in Mensa practice books, though that would not make you more capable of critical thinking or logical argumentation.

Was your intent to insult me?

Comment author: [deleted] 12 February 2013 08:20:56PM *  3 points [-]

You haven't met someone with an IQ like 160 or 180. Those people tend to be very, very different so maybe you are only comparing people with much smaller IQ differences with each other.

To the extent that IQ tests are reliable, my IQ is actually measured to be 170 (no re-takes or prior training; assessed by a psychometrician). (Just supplying information here; please don't construe this as an act of defensiveness or showing off, because that is not my intention.) I was also not only comparing people with smaller IQ differences -- I have encountered people with 10+ points of IQ difference who are nevertheless not significantly different in terms of their ability to contribute meaningfully to dialogues. But, of course, my sample size is not huge.

Was your intent to insult me?

No, but I am sorry that you feel that way. I can be socially inept.

Comment author: Epiphany 12 February 2013 08:47:23PM *  0 points [-]

To the extent that IQ tests are reliable, my IQ is actually measured to be 170 (no re-takes or prior training). (Just supplying information here; please don't construe this as an act of defensiveness.)

Well that was unexpected. I'm open-minded enough to consider that this is possibly the case.

FYI: Claims like this are likely to trigger a fit of "overconfident pessimism" (referring to Luke's article) in some of the members. IQ appears to be a consistent pessimism trigger.

Was your intent to insult me? No, but I am sorry that you feel that way. I can be socially inept.

Admitting that is big of you. Thanks for that. My subjective faith in humanity indicator has been incremented a tick in the upward direction.

I see you're new, so I'll inform you: There are a lot of people like us here, meaning, people who know better than to game an IQ test and then delude themselves with the "results".

I won't say there are no status games, but you will find a lot of people who frown on them as much as you appear to in your last comment. I don't even believe in status.

It's really hard to leave the outside world outside. I keep perceiving irrational B.S. everywhere, even though I've been participating here since August. I'm not going to say that there's no irrational B.S. here, or that I haven't adjusted at all; just that my perceptions still haven't entirely adjusted.

It appears that you may have a similar issue of perceiving B.S. in comments where no such B.S. exists.

It's best to be aware of such a tendency if you have it, as this kind of response is, for obvious reasons, kind of alienating to others. Not blaming you for it (I have the same problem). Just trying to help.

Now that we've established that there was a misunderstanding here, would you like to start over by choosing and clarifying a point you want to make, or telling me that you've reinterpreted things? That would tie up this loose end of a conversation.

Out of curiosity, do you feel significantly different from those in the IQ 130 range?

Comment author: Vladimir_Nesov 13 February 2013 01:45:45PM 1 point [-]

I'm open-minded enough to consider that this is possibly the case.

This sounds like identity-driven reasoning. (Antipattern: "Do I accept the claim X? I'm open-minded. Open-minded people would accept X. Therefore I accept X.") The conclusions you draw about something should be given by your understanding of that thing, not by your identity.

Comment author: [deleted] 13 February 2013 01:26:17PM 1 point [-]

About one in three are creative.

Isn't creativity a continuum? Such a sentence sounds as weird as “about one in three is tall” to me.

Comment author: FiftyTwo 09 December 2012 02:43:18AM 1 point [-]

I'm curious, how do you propose spreading ideas or raising the sanity waterline without bringing in new people?

Comment author: DanArmak 09 December 2012 05:58:01PM 7 points [-]

If you want to spread ideas, don't bring outsiders in, send missionaries out.

Comment author: [deleted] 10 December 2012 04:32:49AM *  3 points [-]

Missionaries will nearly inevitably mention LessWrong, which will still attract people to the site (even if they don't stay around for long.)

Comment author: Larks 10 December 2012 12:45:47PM 0 points [-]

People can have read-only access.

Comment author: Eugine_Nier 09 December 2012 03:51:00AM *  4 points [-]

On the other hand if the new people dilute or overwhelm LW culture and lower the sanity waterline on LW, it won't be able to raise the sanity waterline in the rest of the world. It's a balancing act.

Comment author: Epiphany 09 December 2012 02:56:15AM *  2 points [-]

Firstly, it is not my view that we should not bring in new people. My view is that if we bring in too many new people at once, it will be intolerable for the old users and they will leave. That won't raise the sanity waterline as effectively as growing the site at a stable pace.

Secondly, the poll has an option "Send beginners to the Center for Applied Rationality" (spelled "Modern" rather than "Applied" in the poll because I was unaware that CFMR had changed its name to CFAR).

Comment author: Epiphany 08 December 2012 11:42:32PM *  0 points [-]

Endless September Poll:

I condensed the feedback I got in the last few threads into a summary of pros and cons of each solution idea, if you would like something for reference.


How concerned should we be about LessWrong's culture being impacted by:

...overwhelming user influx?

...trending toward the mean?

...some other cause?

(Please explain the other causes in the comments.)


Which is the best solution for:

...overwhelming user influx?

(Assuming user is of right type/attitude, too many users for acculturation capacity.)


...trending toward the mean?

(Assuming user is of wrong type/attitude, regardless of acculturation capacity.)


...other cause of cultural collapse?

Note: Ideas that involve splitting the registered users into multiple forums were not included for the reasons explained here.

Note: "The Center for Modern Rationality" was renamed to "The Center for Applied Rationality".


Comment author: faul_sname 09 December 2012 12:34:10AM 16 points [-]

You're focusing on negative reinforcement for bad comments. What we need is positive reinforcement for good comments. Because there are so many ways for a comment to be bad, discouraging any given type of bad comment will do effectively nothing to encourage good comments.

"Don't write bad posts/comments" is not what we want. "Write good posts/comments" is what we want, and confusing the two means nothing will get done.

Comment author: Viliam_Bur 09 December 2012 10:56:39PM 4 points [-]

We need to discourage comments that are not-good, not just plainly bad ones: comments that add no value but still take time to read.

The time lost per comment is trivial, but the time lost reading a thousand comments isn't. How long does it take LW to produce a thousand comments? A few days at most.

This article alone has about 100 comments. Did you get 100 insights from reading them?

Comment author: gjm 09 December 2012 12:34:12AM 9 points [-]

Why do the first three questions have four variations on the theme of "new users are likely to erode the culture" and nothing intermediate between that and "there is definitely no problem at all"?

Why ask for the "best solution" rather than asking "which of these do you think are good ideas"?

Comment author: FiftyTwo 09 December 2012 02:40:53AM 4 points [-]

Also, why is there no option for "new users are a good thing?"

Maybe a diversity of viewpoints might be a good thing? How can you raise the sanity waterline by only talking to yourself?

Comment author: Epiphany 09 December 2012 07:35:00PM *  5 points [-]

The question is asking you:

"Assuming user is of right type/attitude, too many users for acculturation capacity."

Imagine this: there are currently 13,000 LessWrong users (more, really, since that figure is from a few months ago and there's been a Summit since then) and about 1,000 are active. Imagine LessWrong gets Slashdotted: some big publication does an article on us, and instead of portraying LessWrong as "cold and calculating", or something like Wired's description of the futurology subreddit where SingInst had posted about AI ("A sub-reddit dedicated to preventing Skynet"), they actually say something good, like "LessWrong solves X problem". Not infeasible, since some of us do a lot of research and test our ideas.

Say so many new users join in the space of a month and there are now twice as many new active users as older active users.

This means 2/3 of LessWrong is clueless, posting annoying threads, and acting like newbies. Suddenly, it's not possible to have intelligent conversation about the topics you enjoy on LessWrong anymore without two people throwing strawman arguments at you and a third saying things that show obvious ignorance of the subject. You're getting downvoted for saying things that make sense, because new users don't get it, and the old users can't compensate for that with upvotes because there aren't enough of them.

THAT is the type of scenario the question is asking about.

I worded it as "too many new users for acculturation capacity" because I don't think new users are a bad thing. What I think is bad is when there are an overwhelming number of them such that the old users become alienated or find it impossible to have normal discussions on the forum.

Please do not confuse "too many new users for acculturation capacity" with "new users are a bad thing".

Comment author: Epiphany 09 December 2012 12:37:27AM *  1 point [-]

Why do the first three questions have four variations on the theme of "new users are likely to erode the culture" and nothing intermediate between that and "there is definitely no problem at all"?

Why do you not see the "eroded the culture" options as intermediate options? The way I see it, there are three tiers of answers, each suggesting a different level of concern:

  1. There's a problem.
  2. There's some cultural erosion but it's not a problem (Otherwise you'd pick #1.)
  3. There's not a problem.

What intermediate options would you suggest?

Why ask for the "best solution" rather than asking "which of these do you think are good ideas"?

A. Because the poll code does not make check boxes where you select more than one. It makes radio buttons where you can select only one.

B. I don't have infinite time to code every single idea.

If more solutions are needed, we can do another vote and add the best one from that (assuming I have time). One thing at a time.

Comment author: Nornagest 09 December 2012 01:24:38AM 0 points [-]

The option I wanted to see but didn't was something along the lines of "somewhat, but not because of cultural erosion".

Comment author: Epiphany 09 December 2012 01:55:34AM *  0 points [-]

Well, I did not imagine every possible concern you all might have in order to choose verbiage vague enough for those options to work as perfect catch-alls. But I did ask for "other causes" in the comments, and I'm interested to see the concerns people are adding, like "EY stopped posting" and "We don't have enough good posters", which aren't about cultural erosion but about a lapse in the stream of good content.

If you have concerns about the future of LessWrong not addressed so far in this discussion, please feel free to add them to the comments, however unrelated they are to the words used in my poll.

Comment author: Alicorn 09 December 2012 12:18:33AM 3 points [-]

It's the Center for Applied Rationality, not Modern Rationality.

Comment author: beoShaffer 09 December 2012 01:47:33AM *  1 point [-]

...some other cause?

I assign non-negligible probability to some cause that I am not specifically aware of (sorta, but not exactly, an outside context problem) having a negative impact on LW's culture.

Comment author: Armok_GoB 13 December 2012 12:39:02AM 0 points [-]

Proposed solution: add lots of subdivisions with different requirements.

Comment author: Epiphany 13 December 2012 10:25:53PM *  2 points [-]

I had a couple of ideas like this myself and I chose to cull them before doing this poll for these reasons:

The problem with splitting the discussions is that we'd end up with people having the same discussions in multiple places. No single post would have all the information, so you'd have to read several times as much if you wanted to get it all. That would reduce the efficiency of the LessWrong discussions to a point where most would probably find it maddening and unacceptable.

We could demand that users stick to a limited number of subjects within their subdivision, but then discussion would be so limited that the user experience would not resemble participation in a subculture. Or, more likely, the limits just wouldn't be enforced thoroughly enough to stop people from talking about what they want, and the dreaded plethora of duplicated discussions would still result.

The best alternative to this as far as I'm aware is to send the users who are disruptively bad at rational thinking skills to CFAR training.

Comment author: wedrifid 13 December 2012 10:49:09PM *  1 point [-]

The best alternative to this as far as I'm aware is to send the users who are disruptively bad at rational thinking skills to CFAR training.

That seems like an inefficient use of CFAR training (and so an inefficient use of whatever resources that would have to be used to pay CFAR for such training). I'd prefer to just cull those disruptively bad at rational thinking entirely. Some people just cannot be saved (in a way that gives an acceptable cost/benefit ratio). I'd prefer to save whatever attention or resources I was willing to allocate to people-improvement for those that already show clear signs of having thinking potential.

Comment author: Armok_GoB 14 December 2012 02:13:09AM 3 points [-]

I am among those absolutely hardest to save, having an actual mental illness. Yet this place is the only thing saving me from utter oblivion and madness. Here is where I have met my only real friends ever. Here is the only thing that gives me any sense of meaning, reason to survive, or glimmer of hope. I care fanatically about it.

Many of the rules that have been proposed, or for that matter even the amount of degradation that has ALREADY occurred... if that had been the case a few years ago, I wouldn't exist; this body would either be rotting in the ground or literally occupied by an inhuman monster bent on the destruction of all living things.

Comment author: Epiphany 14 December 2012 06:59:04AM *  6 points [-]

I'm fascinated. (I'm a psychology enthusiast who refuses to get a psychology degree because I find many of the flaws with the psychology industry unacceptable). I am very interested in knowing how LessWrong has been saving you from utter oblivion and madness. Would you mind explaining it? Would it be alright with you if I ask you which mental illness?

Would you please also describe the degradation that has occurred at LW?

Comment author: Armok_GoB 14 December 2012 10:39:03PM 2 points [-]

I'd rather not talk about it in detail, but it boils down to LW promoting sanity in general and connecting smart people. That extra sanity can be used to cancel out insanity, not just to create super-sanes.

Degradation: Lowered frequency of insightful and useful content, increased frequency of low quality content.

Comment author: Epiphany 14 December 2012 06:55:29AM *  0 points [-]

I have to admit I am not sure whether to be more persuaded by you or Armok. I suppose it would come down to a cost/benefit calculation that weighs the destruction averted by saving the worst against the benefit produced by helping the best. Brilliant people can have quite an impact indeed, but they are rare, and it is easier to destroy than to create, so it is not readily apparent to me which group it would be more beneficial to focus on, or, if both, in what proportion.

Practically speaking, though, CFAR has stated that they have plans to make web apps to help with rationality training and training materials for high schoolers. It seems to me that they have an interest in targeting the mainstream, not just the best thinkers.

I'm glad that someone is doing this, but I also have to wonder if that will mean more forum referrals to LW from the mainstream...

Comment author: Armok_GoB 14 December 2012 02:06:31AM 0 points [-]

Ctrl+C, Ctrl+V, problem solved.

Comment author: Epiphany 14 December 2012 07:24:29AM 2 points [-]

If you're suggesting that duplicated discussions can be solved with paste, then you are also suggesting that we not make separate areas.

Think about it.

I suppose you might be suggesting that we copy the OP and not the comments. Often the comments have more content than the OP, and often that content is useful, informative, and relevant. So we'd end up with comment sections under each copy of the OP that partially duplicated each other and partially diverged.

So, we could copy the comments over to the other area... but then they're not separate...

Not seeing how this is a solution. If you have some different clever way to apply Ctrl+C, Ctrl+V then please let me know.

Comment author: Armok_GoB 14 December 2012 10:35:21PM 0 points [-]

No, because only the top content in each area would be shared to the others.

Comment author: Eugine_Nier 15 December 2012 09:03:33PM 1 point [-]

This creates a trivial inconvenience.

Comment author: Armok_GoB 16 December 2012 02:49:32AM 0 points [-]

So add a "promote" button that basicaly does the same automatically.

Comment author: Vaniver 09 December 2012 08:38:38PM *  0 points [-]

The bad news is that LessWrong's IQ average has decreased on each survey. It can be argued that it's not decreasing by a lot or we don't have enough data, but if the data is good, LessWrong has lost 52% of its giftedness since March of 2009.

What? The inflated self-estimates have dramatically declined towards more likely numbers. Shouldn't we be celebrating a decrease in bias?

Edit: My analysis of the public survey data; in particular, the number of responders is a huge part of the estimate. If you assume every non-responder has, on average, an IQ of 100, the total average LW IQ is 112. Much of the work might be done by the selection effects of self-reporting.
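
For what it's worth, the 112 figure checks out arithmetically against the response counts quoted elsewhere in this thread (165 + 181 = 346 self-reported IQs out of 549 + 518 = 1067 responses, with a mean of roughly 138.3). A quick Python sanity check, assuming non-responders average exactly 100:

    responders = 165 + 181            # survey takers who self-reported an IQ
    total = 549 + 518                 # all survey takers
    reported_mean = 138.3             # approximate mean of the self-reports
    assumed_nonresponder_mean = 100   # the assumption being tested

    overall = (responders * reported_mean
               + (total - responders) * assumed_nonresponder_mean) / total
    print(round(overall, 1))          # 112.4, matching the ~112 figure above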

Comment author: Epiphany 09 December 2012 09:50:14PM *  1 point [-]

This might be virtuous doubt. Have you considered the opposite?

See Also:

Luke's article on overconfident pessimism.

My IQ related links in the OP.

Comment author: Vaniver 09 December 2012 10:14:17PM *  2 points [-]

Have you considered the opposite?

Yes, and I'm familiar with your IQ-related links in the OP*. But what's the opposite here? Let me make sure my position is clear: I agree that the people who post on LW are noticeably cleverer than the people that post elsewhere on the internet.

The narrow claim that I'm making is that the average self-reported IQ is almost definitely an overestimate of the real average IQ of people who post on LW, and a large change towards the likely true value in an unreliable number should not be cause for alarm. The primary three pieces of evidence I submit are:

  1. On this survey, around a third of people self-reported their IQ, and it's reasonable to expect that there is a systematic bias, such that people with higher perceived IQs are more likely to share them. I haven't checked how many people self-reported on previous surveys, but it's probably similarly low.

  2. When you use modern conversion numbers for average SAT scores, you get a reasonable 97th percentile for the average LWer. Yvain's estimate used a conversion chart from two decades ago; in case you aren't familiar with the history of psychometric testing, that's when the SAT had its right tail chopped off to make the racial gap in scores less obvious.

  3. The correlation between the Raven's test and the self-reported IQ scores is dismal, especially the negative correlation for people without positive LW karma. The Raven's test is not designed to differentiate well between people who are more than 99th percentile (IQ 135), but the mean score of 127 (for users with positive karma) was 96th percentile, so I don't think that's as serious a concern.

* I rechecked the comment you linked to in the OP, and I think it was expanded since I first read it. I agree that more than half of people provided at least one IQ estimate, but I think that they should not be weighted uniformly; for example, using the self-reported IQ to validate the self-reported IQ seems like a bad idea! It might be interesting to see how SAT scores and age compare; we do have a lot of LWers who presumably took the SAT before it was dramatically altered, and with younger LWers we can compare scores out of 1600 to scores out of 2400. It's not clear to me how much more clarity this would give, though, or how much the average IQ of LW survey responders actually matters.

Comment author: Epiphany 10 December 2012 12:19:43AM *  1 point [-]

What IQ would you correlate to the SAT numbers, considering?

As for the Raven's numbers, I am not sure where you're getting them from. I don't see a column when searching for "raven" in the 2012 spreadsheet, nor do I see "raven" on the survey result threads.

Comment author: Vaniver 10 December 2012 01:30:28AM *  2 points [-]

What IQ would you correlate to the SAT numbers, considering?

SAT scores, combined with the year that it was taken in, give you a percentile measure of that person (compared to test-takers, which is different from the general population, but in a fairly predictable way), which you can then turn into an IQ-equivalent.

I say equivalent because there are a number of issues. First, intelligence testing has a perennial problem that absolute intelligence and relative intelligence are different things. Someone who is 95th percentile compared to high school students in 1962 is not the same as someone who is 95th percentile compared to high school students in 2012. It might also be more meaningful to say something like "the median LWer can store 10 numbers in working memory, compared to the general population's median of 7" instead of "the median LWer has a working memory that's 95th percentile." (I also haven't looked up recently how g-loaded the SAT is, and that could vary significantly over time.)

Second, one of the main benefits may not be that the median LWer is able to get into MENSA, but that the smartest LWers are cleverer than most people have had the chance to meet during their lives. This is something that IQ tests are not very good at measuring, especially if you try to maintain the normal distribution. Reliably telling the difference between someone who is 1 out of 1,000 (146) and someone who is 1 out of 10,000 (155) is too difficult for most current tests; how many people would you have to base your test on to reliably tell that someone is one out of a million (171) from their raw score?
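
As a rough illustration, assuming the usual normal model with mean 100 and SD 15, the rarity-to-score arithmetic in the previous paragraph can be reproduced in a few lines of Python:

    from scipy.stats import norm

    def iq_at_rarity(n, mean=100.0, sd=15.0):
        # IQ score such that roughly 1 in n people scores higher.
        return mean + sd * norm.ppf(1.0 - 1.0 / n)

    for n in (1_000, 10_000, 1_000_000):
        print(f"1 in {n:,}: IQ ~{iq_at_rarity(n):.1f}")
    # 1 in 1,000: ~146.4; 1 in 10,000: ~155.8; 1 in 1,000,000: ~171.3,
    # matching the 146 / 155 / 171 figures above.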

As for the Raven's numbers, I am not sure where you're getting them from. I don't see a column when searching for "raven" in the 2012 spreadsheet, nor do I see "raven" on the survey result threads.

iqtest.dk is based on Raven's Progressive Matrices; the corresponding column, CV in the public .xls, is called IQTest. I referred to the scores that way because Raven's measures a particular variety of intelligence. It's seen widespread adoption because it's highly g-loaded and culture fair, but a score on Raven's is subtly different from a total score on WAIS, for example.

Comment author: Kindly 10 December 2012 12:55:33AM 1 point [-]

It's the "IQTest" column, corresponding to scores from iqtest.dk.

Comment author: Decius 09 December 2012 09:53:42PM 1 point [-]

How many believe that the current culture of LW should deviate exactly as much as, or more than, it currently does from the culture of the people who are likely to join (and therefore influence it)?

Comment author: Epiphany 27 December 2012 01:22:37AM *  0 points [-]

It has occurred to me to wonder whether the poll might be biased. I wanted to add a summary of things that protect LessWrong against endless September when I wrote this post, but I couldn't think of even one. I figured my thread to debate whether we should have better protection would have turned up any compelling reasons to think LessWrong is protected, but it didn't.

I became curious about this just now, wondering whether there really isn't a single reason to think that LessWrong is protected, and I re-read all of the comments (though not the replies to the comments) to see if I had forgotten any. This comment by AndrewHickey was the closest thing I found to an argument that something is protecting LessWrong:

If anything, LW is far more at risk of becoming an echo chamber than of an eternal September. Fora can also die just by becoming a closed group and not being open to new members, and given that there's a fairly steep learning curve before someone is accepted here ("read the Sequences!") it would, if anything, make more sense to be reducing barriers to entry rather than adding more.

  1. The registration numbers showed that LessWrong is gaining members fast, so the echo chamber idea does not appear to be supported.

  2. As for the "steep learning curve" idea, the 2012 Survey Results show that only 1/4 of the survey respondents have read the Sequences, and that 60% of participants either have not read them or have not finished them. Given that the majority of participants haven't finished the Sequences, I think LessWrong's steep learning curve is more likely to add to the risk than to have any protective benefit: if most people are going "Your culture is TLDR; I'm commenting anyway," then they're participating without all the cultural knowledge.

One reflex is to think that the current karma system will protect LessWrong against endless September, but thinking about that strategy further, one realizes that there is a limit to how many new posts older users can read and vote on. Karma would not help if there were enough new users, or users closer to the mean, to overwhelm the older users' voting capacity.
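
To make that capacity argument concrete, here is a toy calculation; every figure below is a made-up assumption for illustration, not a measurement of LW:

    veterans = 1000                    # hypothetical acculturated voters
    votes_per_veteran_per_day = 10     # generous reading/voting budget
    newcomers = 10_000                 # hypothetical post-Slashdot influx
    comments_per_newcomer_per_day = 0.5
    votes_needed_per_comment = 5       # for a meaningful karma signal

    capacity = veterans * votes_per_veteran_per_day
    demand = int(newcomers * comments_per_newcomer_per_day
                 * votes_needed_per_comment)
    print(f"capacity: {capacity:,} votes/day; demand: {demand:,} votes/day")
    # capacity: 10,000 votes/day; demand: 25,000 votes/day -- at that point
    # karma stops sorting content, which is the failure mode described above.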

As far as I can tell, there's currently nothing that is likely to protect LessWrong from eternal September.