In the thread "LessWrong could grow a lot, but we're doing it wrong.", I explained why I think LessWrong has the potential to grow quite a lot faster, and volunteered to help LessWrong grow.  Of course, a lot of people were concerned that a large influx of new members would not directly translate into higher-quality contributions or beneficial learning and social experiences in discussions, so I realized it would be better to help protect LessWrong first.  I do not assume that fast growth has to lower standards.  I think fast growth can be good if the right people are joining and all goes well (specifics below).  However, if LessWrong grows carelessly, we could be inviting an "Endless September", a term for the never-ending deluge of newbies that "degraded standards of discourse and behavior on Usenet and the wider Internet" (named after a phenomenon caused by an influx of college freshmen).

  My perspective is that this could happen at any time, regardless of whether any of us does anything.  Why do I think that?  LessWrong is growing very fast and could snowball on its own.  I've seen that happen; I saw it ruin a forum.  As far as I know, that site wasn't doing anything special to advertise the forum.  The forum was simply popular, and growth went exponential.  For this reason, I asked for a complete list of LessWrong registration dates in order to make a growth chart.  I received it on 08-23-2012.  The data shows that LessWrong has 13,727 total users, not including spammers and deleted accounts.  From these, I have created a LessWrong growth bar graph:

  [Bar graph: LessWrong registrations per month, through 08-23-2012.]

  Each bar represents one month of registrations (the last bar is a little short, since it only goes up to the 23rd).  The number of pixels in each bar is equal to the number of registrations that month.  The first (leftmost) bar that hits the top of the picture (it actually goes waaaay off the page) mostly represents the transfer of over 2000 accounts from Overcoming Bias.  The bar on the right that goes off the page is so far unexplained to me - 921 users joined in September 2011, more than three times the number in the months before and after it.  If you happen to know what caused that, I would be very interested in finding out.  (No, September 2010 does not stand out, if you were wondering the same thing.)  If anyone wants to do different kinds of analysis, I can generate more numbers and graphs fairly easily.
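
  For anyone who wants to reproduce or extend the chart, here is a minimal sketch of the monthly binning (assuming the registration dates arrive as one ISO date per line; the filename is a placeholder):

```python
# Sketch only: bin registration dates by calendar month and draw a bar chart.
# "registration_dates.txt" is a hypothetical file with one YYYY-MM-DD date per line.
from collections import Counter
from datetime import datetime

import matplotlib.pyplot as plt

with open("registration_dates.txt") as f:
    dates = [datetime.strptime(line.strip(), "%Y-%m-%d") for line in f if line.strip()]

monthly = Counter(d.strftime("%Y-%m") for d in dates)   # e.g. "2011-09" -> 921
months = sorted(monthly)

plt.bar(range(len(months)), [monthly[m] for m in months], width=1.0)
plt.xticks(range(0, len(months), 6), months[::6], rotation=45)
plt.ylabel("registrations per month")
plt.tight_layout()
plt.savefig("lesswrong_growth.png")
```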

  As you can see, LessWrong has experienced pretty rapid growth.

  Growth is in a downward trend at the moment, but as you can see from the wild spikes everyplace, this could change at any time.  In addition to LessWrong growing on its own, other events that could trigger an "endless September" effect are:

  LessWrong could be linked to by somebody really big (see: Slashdot effect on Wikipedia).

  LessWrong could end up on the news after somebody does something newsworthy, or because a reporter discovers LessWrong culture and finds it interesting or weird.

  (A more detailed explanation is located here.)

  For these reasons, I feel it is a good idea to begin constructing endless September protection, so I have volunteered some of my professional web services to get it done.  This has to be done carefully, because if it is not done right, various unwanted things may happen.  I am asking for any ideas, or links to ideas, that you think are good, and I am laying out my proposed solutions and the pitfalls I have planned for below in order to seek your critiques and suggestions.

 

Cliff Notes Version:

  I really thought this out quite a bit, because I think it's going to be tricky and because it's important.  So I wrote a cliff notes version of the solution ideas below, with pros and cons for each, which is about a tenth the size.

 

The most difficult challenge and my solution:

  People want the site to be enriching for those who want to learn better reasoning but haven't gotten very far yet.

  People also want an environment where they can get a good challenge, where they are encouraged to grow, where they can get exposed to new ideas and viewpoints, and where they can get useful, constructive criticism. 

  The problem is that a basic desire all humans seem to share is a desire to avoid boredom.  There is possibly a survival reason for this:  there is no way to know everything, but missing even one piece of information can spell disaster.  This may be why the brain appears to have evolved built-in motivators that prod you to learn constantly.  From the mild ecstasy of flow state (see Csikszentmihalyi's Flow: The Psychology of Optimal Experience) to tedium, we are constantly being punished and rewarded based on whether we're receiving the optimal challenge for our level of ability. 

  This means that those who are here for a challenge aren't going to spend their time being teachers for everybody who wants to learn.  Not everyone has a teacher's personality and skill set to begin with, and some people who teach do it as writers, explaining to many thousands, rather than by explaining it one-to-one.  If everyone feels expected to teach by hand-holding, most will be punished by their brains for not learning more themselves, and will be forced to seek a new learning environment.  If beginners are locked out, we'll fail at spreading rationality.  The ideal is to create an environment where everyone gets to experience flow, and no one has to sacrifice optimal challenge.

  To make this challenge a bit more complicated, American culture (yes, a majority of the visits, 51.12%, come from the USA - I have access to the Google Analytics) can get pretty touchy about elitism and has a strong streak of anti-intellectualism.  Even though the spirit of LessWrong - wanting to promote rational thought - is not elitist but actually inherently the opposite (increasing good decision-making in the world "spreads the wealth" rather than hoarding it or demanding privileges for being capable of good decisions), there is a risk that people will see this place as elitist.  And even though self-improvement is inherently non-pretentious (by choosing to do self-improvement, you're admitting that you've got flaws), undoubtedly there will be a large number of people who might really benefit from learning here but instead insta-judge the place as "pretentious".  Interpreting everything intellectual as pretentious and elitist is an unfortunate habit in our culture.  I think that, with the right wording on the most prominent pages (about us, register, home page, etc.), LessWrong might be presented as a uniquely non-elitist, non-pretentious place.

  For these reasons, I am suggesting multiple discussion areas that are separated by difficulty levels.  Presenting them as "Easy and Hard" will do three things:

  1. Serve as a reminder to those who attend that it's a place of learning where the objective is to get an optimal challenge and improve as far as possible.  This would help keep it from coming across as pretentious or elitist.

  2. Create a learning environment that's open to all levels, rather than a closed, elitist environment or one that's too daunting.  The LessWrong discussion area is a bit daunting to users, so it might be really desirable for people to have an "easy" discussion area where they can learn in an environment that is not intimidating.

  3. Give us an opportunity to experiment with approaches that help willing people learn faster.

 

Endless September protection should be designed to avoid causing these side-effects:

 

  Creating an imbalance in the proportion of thick-skinned individuals to normal individuals.

  Anything that annoys, alienates or discourages users is going to deter a lot of people while retaining thick-skinned individuals.  Some thick-skinned individuals are leaders, but many are trolls, and thick-skinned individuals may be more likely to resist acculturation or try to change the culture (though it could be argued the other way - that their thick skin allows them to take more honest feedback).  For example: anonymous, unexplained down votes create a gauntlet for new users to endure which selects for a high tolerance to negative feedback.  This may be the reason it has been reported that there are a lot of "annoying debater types".

 

  People that we do want fail to join because the method of protection puts them off.

  There are two pitfalls that I think are going to be particularly attractive, but we should really avoid them:

  1.) Filtering into hard/easy based on anything other than knowledge about rational thinking.  There are various ways that could go very wrong.

    - Filtering in any other way will keep out advanced folks who may have a lot to teach.

    If a person has already learned good reasoning skills in some other way, do we want them at the site?  There might be logic professors, Zen masters, debate competition champs, geniuses, self-improvement professionals, hard-core bookworms and other people who are already advanced and are interested in teaching others to improve their skills, or in finding a good challenge, or in contributing articles, but who have already learned much of the material the sequences cover.  Imagine that a retired logic professor comes by hoping to get a challenge from similarly advanced minds and perhaps do a little volunteer work teaching about logic as a pastime.  Now imagine requiring them to read 2,000 pages of "how to think rationally" in order to gain access to all the discussion areas.  This will almost guarantee that they go elsewhere.

    - Filtering based on the sequences or other cultural similarities would promote conformity and repel the true thinkers.

    If true rationalists think for themselves, some of them will think differently, and some of them will disagree.  Eliezer explained in "Undiscriminating Skepticism": "I do propose that before you give anyone credit for being a smart, rational skeptic, that you ask them to defend some non-mainstream belief."  He defines such a belief as follows: "It has to be something that most of their social circle doesn't believe, or something that most of their social circle does believe which they think is wrong."  If we want the "hard" social group to contain people who are likely to hold and defend non-mainstream beliefs, we have to filter out people who are unable to defend their beliefs - without scaring off those whose beliefs differ from the group's.

  2.) Discouraging people with unusually flawed English from participating at all levels.  Doing that would stop two important sources of new perspectives from flowing in:

    - People with cultural differences, who may bring in fresh perspectives.

    If you're from China, you may want to share perspectives that could be new and important to a Westerner, but may be less likely to meet the technical standards of a perfectionist when it comes to writing in English.

    - People with learning differences, whose brains work differently and may offer unique insight.

    A lot of gifted people have learning disorders, and even gifted people who don't still tend to have large gaps between their skill levels.  It is not uncommon to find a gifted person whose ability in one skill is as much as 40% behind (or ahead of) their abilities in other areas.  This phenomenon is called "asynchronous development".  We associate spelling and grammar with intelligence, but the truth is that those who have a high verbal IQ may not have equally intelligent things to say, while people who word things crudely due to asynchronous development (engineers, for instance, are not known for their communication skills but can be brilliant at engineering) may be ignored even though they could have important things to say.  Dyslexics, who have all kinds of trouble - from spelling to vocabulary to arranging sentences oddly - may be ignored despite the fact that "children and adults who are dyslexic usually excel at problem solving, reasoning, seeing the big picture, and thinking out of the box" (Yale).

   Everyone understands the importance of making sure all the serious articles get published with good English, but frequently in intellectual circles the attitude is that if you aren't a perfectionist about spelling and grammar, you're not worth listening to at all.  Getting articles polished when they are written by dyslexics or by people for whom English is a second language should be a pretty easy problem to solve - people with English problems can simply seek out a volunteer editor.  The ratio of articles being published by these folks to the number of users at the site encourages me to believe that they will be able to find someone to polish their work.  Since it would be so easy to accommodate these difficulties, taking an attitude that puts form over function as a filter would not serve us well.  If dyslexics and people from cultures different from the majority's feel that we're snobby about technicalities, they could be put off.  This could already be happening, and we could be missing out on the most creative and most different perspectives this way.

 

People who qualify under the "letter" of the standards do not meet the spirit of the standards.

  For instance:  They claim to be rationalists because they agree with a list of things that rationalists agree with, but they don't think for themselves, as Eliezer cautions in "Undiscriminating Skepticism".  Asking them questions like "Are you an atheist?" and "Do you think signing up for cryo makes sense?" would only draw large numbers of people who agree but do not think for themselves.  Worse, it would send a strong message: "If you don't agree with us about everything, you aren't welcome here."

 

The right people join, but acculturate slowly or for some reason do not acculturate. 

  - Large numbers of users, even desirable ones, will be frustrating if newbie materials are not prominently posted.

  I was very confused and disoriented as a new user.  I think that there's a need for an orientation page.  I wrote about my experiences as a new user here, which I think might make a good starting point for such an orientation page.  I think LessWrong also needs a written list of guidelines and rules that's positioned to be "in your face", the way the rest of the internet does it (because if users don't see it where they expect to find it, they will assume there isn't one).  If new users adjust quickly, both old users and new users will be less annoyed if/when lots of new users join at once.

 

The filtering mechanism gives LessWrong a bad name.

  For instance, if we were to use an IQ test to filter users, the world may feel that LessWrong is an elitist organization.  Sparking an anti-intellectual backlash would do nothing to further the cause of promoting rationality, and it doesn't truly reflect the spirit of bringing everyone up, which is what this is supposed to do.  Similarly, asking questions that may trigger racial, political or religious feelings could be a bad idea - not because they aren't sources of bias, but because they'll scare away people who may have been open to questioning and growing but are not open to being forced to choose a different option immediately.  The filters should be a test about reasoning, not a test about beliefs.

 

Proposed Filtering Mechanisms:

 

  Principle One:  A small number of questions can deter a lot of activity.

  As a web pro, I have watched a 10-question registration form slash the number of files sent through a file upload form that used to be public.  The ten questions were not that hard - just name, location, password, etc.  Asking questions deters people from signing up.  Period.  That, I think, is why a lot of big websites have begun asking for minimal registration info - email address and password only - if you've observed this trend as well.  Years ago, that was not common; it seemed that everyone wanted to give you ten or twenty questions.  For this reason, I think it would be best if the registration form stays simple, but if we create extra hoops to jump through to use the hard discussion area, only those who are seriously interested will join in there.  Specific examples of questions that meet the other criteria are located in the proposed acculturation methods section under: "A test won't deter ignorant cheaters, but it can force them to educate themselves."

 

  Principle Two:  A rigorous environment will deter those who are not serious about doing it right.

  The ideal is to fill the hard discussion area with the sort of rationalists who want to keep improving, who are not afraid to disagree with each other, who think for themselves.  How do you guarantee they're interested in improving?  Require them to sacrifice for improvement.  Getting honest feedback is necessary to improve, but it's not pleasant.  That's the perfect sacrifice requirement:

  Add a check box that they have to click where it says "By entering the hard discussion area, I'm inviting everyone's honest criticisms of my ideas.  I agree to take responsibility for my own emotional reactions to feedback and to treat feedback as valuable.  In return for their valuable feedback, which is a privilege and service to me, I will state my honest criticisms of their ideas as well, regardless of whether the truth could upset them."

  I think it's common to assume that in order to give honest feedback one has to throw manners out the window.  I disagree with that.  I think there's a difference between pointing out a brutal reality, and making the statement of reality itself brutal.  Sticking to certain guidelines like attacking the idea, not the person and being objective instead of ridiculing makes a big difference.  

  There are other ways, also, for less bold people, like the one that I use in IRL environments: Hint first (sensitive people get it, and you spare their dignity) then be clear (most people get it) then be brutally honest (slightly dense people get it). If I have to resort to the 2x4, then I really have to decide whether enlightening this person is going to be one of those battles I choose or one of those battles I do not choose.  (I usually choose against those battles.)

  How do you guarantee they're capable of disagreeing with others?  Making it clear that they're going to experience disagreements by requiring them to invite disagreements will not appeal to conformists.  Those who are not yet thinking for themselves will find it impossible to defend their ideas if they do join, so most of them will become frustrated and go back to the easy discussion area.  People who don't want intellectual rigor will be put off and leave.

  It's important that the wording for the check box has some actual bite to it, and that the same message about the hard discussion area is echoed in any pages that advise on the rules, guidelines, etiquette, etc.  To explain why, I'll tell a little story about an anonymous friend:

  I have a friend who worked at Microsoft.  He said the culture there was not open to new ideas and that management was not open to hearing criticism.  He interviewed with various companies and chose Amazon.  According to this friend, Amazon actually does a good job of fulfilling values like inviting honest feedback and creating an environment conducive to innovation.  He showed me the written values for each company.  I didn't think much of this at first, because most written values pages are boring and read like empty marketing copy.  But Amazon.com has the most incredible written values page I've ever seen - it does more than sit there like a static piece of text.  It gives you permission.  Instead of saying something fluffy like "We value integrity and honesty and our managers are happy to hear your criticisms," it first creates expectations for management: "Leaders are sincerely open-minded, genuinely listen, and are willing to examine their strongest convictions with humility."  Then it gives employees permission to give honest feedback to decision-makers: "Leaders (all employees are referred to as "leaders") are obligated to respectfully challenge decisions when they disagree, even when doing so is uncomfortable or exhausting.  Leaders have conviction and are tenacious.  They do not compromise for the sake of social cohesion."  The Amazon values page gives their employees permission to innovate as well: "As we do new things, we accept that we may be misunderstood for long periods of time."  If you look at Microsoft's written values, there's no bite to them.  What do I mean by bite?

  Imagine you're an employee at Amazon.  Your boss does something stupid.  The cultural expectation is that you're not supposed to say anything - offending the boss is bad news, right?  So you're inhibited.  But the thing they've done is stupid.  So you remember back to the values page and go bring it up on your computer.  It says explicitly that your boss is expected to be humble and that you are expected to sacrifice social cohesion in this case and disagree.  Now, if your boss gets irritated with you for disagreeing, you can point back to that page and say "Look, it's in writing, I have permission to tell you."

  Similarly, there is what I consider to be a very unfortunate social skills requirement that more or less says: if you don't have something nice to say, don't say anything at all.  Many people feel obligated to keep constructive criticism to themselves.  A lot of us are intentionally trained to be non-confrontational.  If people are going to overcome this lifetime of training to squelch constructive criticism, they need an excuse to ignore that social training.  Not just any excuse: it needs to be worded to require them to do it, and to require them to do it explicitly, despite the consequences.

 

  Principle Three:  If we want innovation, we have to make innovators feel welcome.

  That brings me to another point.  If you want innovation, you can't deter the sort of person who will bring it to you: the "people who will be misunderstood for long periods of time", as Amazon puts it.  If you give specific constructive criticism to a misunderstood person, this will help them figure out how to communicate - how else will they navigate the jungle of perception and context differences between themselves and others?  If you simply vote them down, silently and anonymously, they have no opportunity to learn how to communicate with you and what's worse is that they'll be censored after three votes.  This ability for three people to censor somebody with no accountability, and without even needing a reason, encourages posters to keep quiet instead of taking the sort of risk an innovator needs to take in presenting new ideas, and it robs misunderstood innovators of those opportunities for important feedback - which is required for them to explain their ideas.  Here is an example of how feedback can transform an innovator's description of a new idea from something that seems incomprehensible into something that shows obvious value:

  On the "Let's start an important start-up" thread, KrisC posts a description of an innovative phone app idea.  I read it and I cannot even figure out what it's about.  My instinct is to write it off as "gibberish" and go do something else.  Instead, I provide feedback, constructive criticism and questions.  It turns out that the idea KrisC has is actually pretty awesome.  All it took was for KrisC to be listened to and to get some feedback, and the next description that KrisC wrote made pretty good sense.  It's hard to explain new ideas but with detailed feedback, innovation may start to show through.  Link to KrisC and I discussing the phone app idea.

 

Proposed Acculturation Methods:

 

   Send them to Center for Modern Rationality

   Now that I have discovered the post on the Center for Modern Rationality and have seen that they're targeting the general population and beginners with material for local meetups, high schools and colleges, and that they're planning some web apps to help with rationality training, I see that referring people over to them might be a great option.  Saturn suggested sending them to appliedrationality.org before I found this, but I'm not sure that would be adequate, since I don't see a lot of stuff for people to do on their website.

 

    Highlight the culture.

     A database of cultural glossary terms can be created and used to highlight those terms on the forum.  The terms are already on the page, so what good would this do?  Well, first, they can be automatically linked to the relevant sequence or wiki page.  If old users do not have to look for the link, this speeds up the process of mentioning them to new users quite a lot.  Secondly, it would make the core cultural items stand out from all of the other information, which will likely cause new users to prioritize it.  Thirdly, there will be a visual effect on the page.  You'll be able to see that this place has its own vocabulary, its own personality, its own memes.  It's one thing to tell a new user "LessWrong has been influenced by the sequences" when they haven't seen all those references on all of those pages (and, even if they do see them, won't know where they're from); it's another to make the influence immediately obvious with a visual that illustrates the point.
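
     As a rough illustration of the kind of auto-linking I have in mind, here is a minimal sketch (the glossary entries and URLs are placeholders; a real version would load the terms from a database and operate on the site's rendered HTML, skipping text that is already inside a link):

```python
# Sketch only: wrap known glossary terms in links to their sequence/wiki pages.
# The term-to-URL table below is illustrative, not the real glossary.
import re

GLOSSARY = {
    "Endless September": "https://wiki.lesswrong.com/wiki/Eternal_September",  # hypothetical URL
    "ugh field": "https://wiki.lesswrong.com/wiki/Ugh_field",                  # hypothetical URL
}

def highlight_terms(text: str) -> str:
    """Return the text with each known term wrapped in a highlighted link."""
    for term, url in GLOSSARY.items():
        pattern = re.compile(r"\b" + re.escape(term) + r"\b", re.IGNORECASE)
        text = pattern.sub(lambda m: f'<a class="glossary" href="{url}">{m.group(0)}</a>', text)
    return text

print(highlight_terms("New users keep asking what an ugh field is."))
```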

 

    Provide new users with real feedback instead of mysterious anonymous down votes:

     We have karma vote buttons, but these are not providing useful feedback for new users.  Without a specific reason, I have no way to tell if I'm being downvoted by trolls, and I may see ten different possible reasons for being voted down and not know which one to choose.  This annoyance selects for thick-skinned individuals like trolls and fails to avoid the "imbalance in the proportion of thick-skinned individuals to normal individuals" side-effect.

     If good new users are to be preserved, and the normal-people-to-troll ratio is to be maintained, we need to add a "vote to ban" button that's used only for blatant misbehavior, and if an anonymous feedback system is to be used for voting down, it needs to prompt you for more detailed feedback - either letting you select from categories, or asking for at least one or two words as an explanation.  Also, the comments need to show both upvotes and downvotes.  If you don't know when you've said something controversial, and are being encouraged to view everything you say as black-and-white good-or-bad, that promotes conformity.
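
     To make the idea concrete, here is a minimal sketch of the kind of record such a system might store for each downvote (the field names and category list are illustrative, not a proposal for the final wording):

```python
# Sketch only: a downvote that must carry a reason.  Categories and fields are examples.
from dataclasses import dataclass
from typing import Optional

DOWNVOTE_CATEGORIES = (
    "factually wrong",
    "logically fallacious",
    "off topic",
    "unclear writing",
    "other (explain)",
)

@dataclass
class Downvote:
    voter_id: int
    comment_id: int
    category: str                 # must be one of DOWNVOTE_CATEGORIES
    note: Optional[str] = None    # required when category is "other (explain)"

    def __post_init__(self):
        if self.category not in DOWNVOTE_CATEGORIES:
            raise ValueError(f"unknown category: {self.category}")
        if self.category == "other (explain)" and not self.note:
            raise ValueError("a one- or two-word explanation is required")
```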

 

      A test won't deter ignorant cheaters, but it can force them to educate themselves.

     Questions can be worded in such a way that they serve as a crash course in reasoning in the event that someone posts a cheat sheet or registrants look up all the answers on the internet.  Assuming that the answer options are randomly ordered, so that you have to actually read them, the test should at the very least familiarize them with the various biases, logical fallacies, etc. (a sketch of the random ordering follows the examples below).  Examples:

    --------------

    Person A in a debate explains a belief but it's not well-supported.  Their opponent, person B, says they're an idiot.  What is this an example of?

    A. Attacking the person, a great way to really nail a debate.

    B. Attacking the person, a great way to totally fail in debate because you're not even attacking their ideas.

    --------------

    You are with person X and person Y.  Person Y says they have been considering some interesting new evidence of what might be an alien space craft and aren't sure what to think yet.  You both see person Y's evidence, and neither of you has seen it before.  Person X says to you that they don't believe in UFOs and don't care about person Y's silly evidence.  Who is the better skeptic?

     A. Person X, because they have the correct belief about UFOs.

     B. Person Y, because they are actually thinking about it, avoiding undiscriminating skepticism.

    --------------

    Note:  These questions are intentionally knowledge-based.  If the purpose is to avoid requiring an IQ test, and to create an obstacle that requires you to learn about reasoning before posting in "hard", that's the only way that these can be done.
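
     Here is the sketch of the random answer ordering mentioned above (the question text reuses the first example; everything else, including the scoring, is illustrative):

```python
# Sketch only: present a knowledge-based question with its answer options shuffled,
# so registrants have to read the options rather than memorize their positions.
import random

QUESTION = (
    "Person A in a debate explains a belief but it's not well-supported. "
    "Their opponent, person B, says they're an idiot. What is this an example of?"
)
OPTIONS = [
    ("Attacking the person, a great way to really nail a debate.", False),
    ("Attacking the person, a great way to totally fail in debate "
     "because you're not even attacking their ideas.", True),
]

def ask(question: str, options: list) -> bool:
    """Print the question with shuffled options and return True if the answer was correct."""
    shuffled = options[:]
    random.shuffle(shuffled)
    print(question)
    for i, (text, _correct) in enumerate(shuffled, start=1):
        print(f"  {i}. {text}")
    choice = int(input("Your answer: "))
    return shuffled[choice - 1][1]

if __name__ == "__main__":
    print("Correct!" if ask(QUESTION, OPTIONS) else "Not quite - read the options again.")
```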

 

    Encouraging users to lurk more. 

   Vaniver contributed this: Another way to cut down on new-new interaction is to limit the number of comments someone can make in a time period - if people can only comment once a day until their karma hits 20, and then once an hour until their karma hits 100, and then they're unrestricted, that will explicitly encourage lurking / paying close attention to karma among new members. (It would be gameable, unless you did something like prevent new members from upvoting the comments of other new members, or algorithmically keeping an eye out for people gaming the system and then cracking down on them.) [edit] The delay being a near-continuous function of the karma - say, 24 hours*exp(-b karma) - might make the incentives better, and not require partitioning users explicitly. No idea if that would be more or less effort on the coding side.
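
    A minimal sketch of the near-continuous delay Vaniver describes, with the decay constant chosen arbitrarily so that the delay falls to about one hour at 100 karma:

```python
# Sketch only: Vaniver's near-continuous rate limit, delay = 24h * exp(-b * karma).
# b is chosen here (arbitrarily, for illustration) so the delay is ~1 hour at 100 karma.
import math

B = math.log(24) / 100  # ~0.0318

def comment_delay_hours(karma: int) -> float:
    """Minimum wait between comments for a user with this much karma."""
    return 24 * math.exp(-B * max(karma, 0))

for k in (0, 20, 50, 100, 200):
    print(f"karma {k:>3}: wait {comment_delay_hours(k):.2f} h")
# karma 0 -> 24.00 h, karma 100 -> 1.00 h, karma 200 -> about 2.5 minutes
```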

    Cons:  This would deter some new users from becoming active users by causing them to lose steam on their initial motivation to join.  It might be something that would deter the right people.  It might also filter users, selecting for the most persistent ones, or for some other trait that might change the personality of the user base.  This would exacerbate the filtering effect that the current karma system is exerting, which, I theorize, is causing there to be a disproportionate number of thick-skinned individuals like trolls and debate-oriented newbies.  My theory about how the karma system is having a bad influence

 

    Give older users more voting power. 

    Luke suggested "Maybe this mathematical approach would work. (h/t matt)" on the "Call for Agreement" thread. 

    I question, though, whether changing the karma numbers on the comments and posts in any way would have a significant influence on behavior or a significant influence on who joins and stays. Firstly, votes may reward and punish but they don't instruct very well - unless people are very similar, they won't have accurate assumptions about what they did wrong. I also question whether having a significant influence on behavior would prevent a new majority from forming because these are different problems. The current users who are the right type may be both motivated and able to change, but future users of the wrong type may not care or may be incapable of changing. They may set a new precedent where there are a lot of people doing unpopular things so new people are more likely to ignore popularity. The technique uses math and the author claims that "the tweaks work" but I didn't see anything specific about what the author means by that nor evidence that this is true. So this looks good because it is mathematical, but it's less direct than other options so I'm questioning whether it would work.

  Vladimir_Nesov posted a variation here.

 

  Make a different discussion area for users with over 1000 karma.

  Posted by Konkvistador here.

 

  Make a Multi Generation Culture.

  Limit the number of new users that join the forum to a certain percentage per month, sending the rest to a new forum.  If that forum grows too fast, create additional forums.  This would be like having different generations.  New people would be able to join an older generation if there is space.  Nobody would be labeled a "beginner".
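
  A minimal sketch of how registrants might be routed under such a scheme (the 10% monthly growth cap is an arbitrary placeholder, not a recommendation):

```python
# Sketch only: cap each generation forum's monthly growth at some percentage of its
# existing membership, and overflow new registrants into a newly created forum.
MONTHLY_GROWTH_CAP = 0.10  # placeholder figure

def assign_generation(forums: list, joined_this_month: list) -> int:
    """forums[i] = member count of generation i; joined_this_month[i] = this month's intake.
    Returns the index of the forum the next registrant joins, creating a new one if needed."""
    for i, (members, joined) in enumerate(zip(forums, joined_this_month)):
        if joined < members * MONTHLY_GROWTH_CAP:
            forums[i] += 1
            joined_this_month[i] += 1
            return i
    # Every existing generation is at its cap this month: open a new forum.
    forums.append(1)
    joined_this_month.append(1)
    return len(forums) - 1

# Example: one forum of 13,727 members that has already taken 1,400 new users this month.
forums, joined = [13727], [1400]
print(assign_generation(forums, joined))  # 1 -> routed to a newly created second forum
```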

 

  Temporarily turn off registration or limit the number of users that can join.

  (See the cliff notes version for more.)

 

Should easy discussion participants be able to post articles?

  I think the answer to this is yes, because no filtering mechanism is perfect and the last thing you want to do is filter out people with a different and important point of view.  Unless the site is currently having issues with trolls posting new articles, or with the quality of the articles going down, leaving that freedom intact is best.  I definitely think, though, that written guidelines for posting an article need to be put in "in your face" expected places.  If a lot of new users join at once, well-meaning but confused people will be posting the wrong sorts of things there - making sure they've got the guidelines right there is all that's probably needed to deter them.

 

Testing / measuring results:

  How do we tell if this worked?  Tracking something subjective, like whether we're feeling challenged or inundated with newbies, is not going to be a straightforward matter of looking at numbers.  (Methods to help willing people learn faster deserve their own post.)  But just because it's subjective doesn't mean tracking is impossible, or that working out whether it has made a difference cannot be done.  I suspect that a big difference will be noticed in the hard discussion area right away.  Here are some figures that are relevant and can be tracked, that may give us insight and ways to check our perceptions:

  1.  How many people are joining the hard forum versus the easy forum?  If we've got a percentage, we know how *much* we've filtered, though we won't know exactly *who* we've filtered.

  2.  Survey the users to ask whether the conversations they're reading have increased in quality.

  3.  Survey the users to ask whether they've been learning more since the change.

  4.  See which area has the largest ratio of users with lots of downvotes. 

  (This could be tricky because people who frequently state disagreements might be doing a great service to the group, but might be unpopular because of it, and people who are innovative may be getting voted down due to being misunderstood.  One would think, though, that people who are unpopular due to disagreeing, or being innovative, assuming they're serious about good reasoning, would end up in the hard forum.) 
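
  To make the tracking concrete, here is a minimal sketch of how figures 1 and 4 could be computed from the site's records (the record format is made up for illustration):

```python
# Sketch only: two of the trackable figures above, computed from hypothetical records.
# The "area" / "downvotes" fields are illustrative, not the real schema.

def hard_vs_easy_ratio(new_users: list) -> float:
    """Figure 1: fraction of new registrants who join the hard area."""
    hard = sum(1 for u in new_users if u["area"] == "hard")
    return hard / len(new_users) if new_users else 0.0

def heavy_downvote_share(users: list, threshold: int = 20) -> dict:
    """Figure 4: per area, the share of users with lots of downvotes."""
    result = {}
    for area in ("easy", "hard"):
        group = [u for u in users if u["area"] == area]
        heavy = sum(1 for u in group if u["downvotes"] >= threshold)
        result[area] = heavy / len(group) if group else 0.0
    return result

users = [
    {"area": "easy", "downvotes": 3},
    {"area": "hard", "downvotes": 25},
    {"area": "hard", "downvotes": 2},
]
print(hard_vs_easy_ratio(users))      # 0.666...
print(heavy_downvote_share(users))    # {'easy': 0.0, 'hard': 0.5}
```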

 

Request for honest feedback:

  Your honest criticisms of this idea and your suggestions will be appreciated, and I will update this idea or write a new one to reflect any good criticisms or ideas you contribute.

 

This is in the public domain:

  This idea is hereby released into the public domain, with acknowledgement from Luke Muehlhauser that those were my terms prior to posting.  My intent is to share this idea to make it impossible to patent and my hope is that it will be free for the whole world to use.

  Preventing discussion from being watered down by an "endless September" user influx. by Epiphany is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.

 

Comments (102 total, some truncated):

Regarding elitism: LW is elitist, and would not be what it is without its elitism. What else differentiates LW from /r/skeptic or agi-list? The LW community recognizes that some writings are high quality and deserve to be promoted, and others are not. If anything, I wish LW would become more elitist.


Add a check box that they have to click where it says "By entering the hard discussion area, I'm inviting everyone's honest criticisms of my ideas. I agree to take responsibility for my own emotional reactions to feedback and to treat feedback as valuable. In return for their valuable feedback, which is a privilege and service to me, I will state my honest criticisms of their ideas as well, regardless of whether the truth could upset them."

I am strongly opposed to requiring a version of Crocker's rules to get into good discussions. Wanting to have people be civil to me does not particularly mean that I compromise intellectual rigor. A forum that required Crocker's Rules to participate could be interesting, but it could also be 4chan.

And poorly-informed ranters hereabouts typically think they are the smartest people on the site and the sheep around them are too cowardly, emotional, or stupid to "rationally debate" them. Adding an explicit norm according to which poorly-informed ranters may identify themselves as magnanimously granting favors to everyone they harass, and people who object to such harassment as in violation of that norm (because they're not "taking responsibility for their emotional reactions") would not be an improvement. (Of course they'll ignore the part about receiving feedback as a favor.)

Epiphany (0 points, 12y):
You seem to be assuming that honest criticism also has to mean throwing manners out the window. I do not assume this, and it didn't occur to me to predict that others would assume that when I was writing this. I'll have to update that part. Poorly informed ranters wanting to debate does sound annoying, I didn't realize there was a problem with that. It seems to me the best way to deter them would be to paste a link that's directly related to their points and ignore them. Do that enough times and they'll probably wake up and realize they've got a problem with not knowing what they're talking about. What have you guys tried? Maybe a better question would be "What would you suggest?"
Manfred (4 points, 12y):
Hm - rather, I'd say she is assuming that the words you posted, which she then quoted, would reduce civility for a net loss. I'll requote what I thought was the relevant (i.e. most disagreed-upon) part: Making good posts requires that you take a large measure of responsibility for the audience's response. And this is a skill that is difficult to learn/teach as a new user/culture. Having something like the quoted phrase in an authoritative place would send conflicting messages about what constitutes a "good post," leading to fewer people learning the skill of writing for their audience.
Epiphany (-2 points, 12y):
I agree in that I think most people would interpret my wording to mean "Throw manners out the window in favor of honesty." but I don't think it has to be that way. As far as taking responsibility for emotions goes, there's a limit to what you can do. If you have to tell them something unpleasant and disappointing, if that's the truth, you can't control the fact that they're going to be disappointed. If you sugar-coat, they may not realize the gravity of the situation, and what happens next could be worse. Reality is sometimes unpleasant, that's all there is to it. I can want you to be happy all I feel like, but if the reality isn't happy for you, there's nothing I can do about that. If I know of a solution, I'll usually say so. If not, I can say things like "I know you really care about this, so I hate to say this..." and reassure them that I don't dislike them, but that doesn't change the fact that the reality is unpleasant.
Manfred (0 points, 12y):
"Throw manners out the window" is not what I said you were proposing. I think you may be missing some of what I am saying, or maybe I was just being opaque. So I'll try and give you one clear paragraph: Thinking about other peoples' emotional responses makes communicating with them much more effective, not less effective. If we want to have a "hard discussion" section, or even just a difficult discussion, I want people to be in the habit of thinking about other peoples' emotional responses, not to consider it "not their responsibility." To be clear, when I say "thinking about other peoples' emotions," I don't mean typical "manners stuff" like sweetening difficult truths, etc. I mean actual thinking, about other peoples' emotions. And changing what you say so that the other person will understand what you're trying to communicate. That part's important! Or to put it another way, in order to communicate as best you can, you must take responsibility for your audience's emotional responses insofar as they affect what happens to your message, which is often a lot.
Epiphany (0 points, 12y):
Yeah, that's worthwhile, and it's an art. I'm not sure how that would even be communicated to people if it were, say, put into the rules or something. It would be nice if that level of quality could be expected but I don't see any way to do that. Do you?
Manfred (0 points, 12y):
It might be sweet to find some existing experts in teaching people to speak so that they will be understood by people with complicated and relevant internal states. (Relationship counselors? People who teach autistic people conversation skills? Psychologists who study conversation? Psychologists who study the difference between what the speaker thinks and what the listener thinks?) Anyhow, maybe teaching people this is a near-solved problem, maybe not (and maaaaybe I'll do some research on this before next time I talk about it :D ). And maybe it's unsolvable. But I'd guess it's solvable - lots of things that seem impossible are really us being bad at the skill that makes it possible.
CWG (2 points, 12y):
I haven't come across this either. Doesn't the downvoting minimize this problem? That said, I like civility to be one of the core principles of any discussion group - but without ever feeling we have to agree with what someone else is saying.
Alicorn (4 points, 12y):
No, not really. They say things like "hahahaha, sure, downvote me more, that only proves me right, you're unable to actually address my arguments!" And then people try to address their arguments and get nowhere. It's a remarkably consistent type, actually. This problem is one of the things the controversial new trollfeeding tax is meant to handle.
wedrifid (2 points, 12y):
Spot on. And bizarrely enough there even seems to be a remarkable correlation in the kind of positions this type supports. Something along the lines of an "Incorrect Metacontrarian Cluster".
gwern (3 points, 12y):
http://websites.psychology.uwa.edu.au/labs/cogscience/documents/LskyetalPsychScienceinPressClimateConspiracy.pdf seems relevant:
Epiphany (-1 point, 12y):
Votes don't train newbies. Being a new user who gets voted down sometimes, I can tell you it seems completely random. I can't tell whether it's a troll, or someone with a vendetta, or what it is. And even if I brainstorm a bunch of guesses, the little number at the top of my comment doesn't tell me which one is correct. This expectation that downvotes are going to help new users learn how to behave is even worse than that, though, in a whole bunch of ways at once. I wrote about that here: Idea For Karma Improvements and Why We Need Them
wedrifid (3 points, 12y):
Yes they do. You have been given an abundance of explanations regarding people's reactions which you could, if it is your desire, use to gain more support for your comments. My model of the reception of your comments suggests that you do have several people with a 'vendetta', or at least several people who are highly predisposed to downvote you prior to reading your contributions. But that is to be expected. I get people targeting me all the time and if I didn't it would probably be a sign that I was neglecting my duty. Having a few individuals targeting you isn't a problem. The problem comes when you cannot garner sufficient support from the other, neutral readers to counter the initial downvotes and leave most of your comments as net positive. That is a sign that it is worth paying more attention to politics and perception - and again you've got personal feedback you could use toward that end. Are you really saying that, if motivated, you couldn't work out how to change your behavior such that your comments were more likely to be well received? I mean come on, the thought "Oh, I suppose I should convey less arrogance" is a good starting place for reducing social sanction in just about any social structure that you are a relatively new member of. (Note that I am talking specifically about conveyed arrogance, not actual arrogance. People can get away with being completely obstinate and incapable of learning from the words of others so long as they send the right signals of humility.)
Epiphany (-1 point, 12y):
You read that comment completely out of context and also you seem unaware that at first I was not getting constructive criticism. People only started criticizing me after I decided I was tired of unexplained downvotes and started to advertise in various places (at the ends of my discussion posts, and in my various comments expressing an interest in being challenged intellectually) that I genuinely want honest criticism. My experience is that LessWrong members needed to be convinced that it really was okay to criticize me before they started giving me the large amount of helpful feedback you're seeing. You're very bold, Wedrifid, so you probably figure other people are as comfortable criticizing others as you are. Maybe you think I must have been getting bold criticisms this whole time. I wasn't. The context in which I wrote that comment was this: I was explaining that OTHER new people don't get feedback, in order to explain that the downvotes aren't training them. If you think about it, you even said in your own post that it was the explanation that people use to improve themselves. The votes aren't the same as verbal feedback. Are the other newbies getting the kind of feedback I am? I bet most of them aren't. I was outgoing enough to guess that the reason I wasn't getting feedback is because people didn't feel comfortable criticizing me and chose to begin advertising that I want honest criticisms. I doubt most of your newbies are doing the same thing. Try an experiment. Make a new account. Post things people won't like. See how many of them actually get verbal feedback. Then, advertise that you want constructive criticism. Post the same number of things people won't like, and count how many of those got you verbal feedback. I find your perspective on vendettas and duty refreshing, so thank you. Your comment makes me feel glad that you think I am worth saving. But since you intended to save me from my own stupidity, I feel a little annoyed that you thought I neede
wedrifid (6 points, 12y):
I'm not sure to what extent I did that, but in any case I have a core disagreement with the claim that downvotes do not train newbies. My expectation is that the simple feedback mechanism increases the speed at which newbies absorb local norms and all my observations thus far confirm this. It isn't the only thing that teaches newbies and it isn't a perfect mechanism but it certainly helps. Most people don't like getting downvotes and take action to avoid them. My position is that even in the absence of any explicit verbal feedback downvotes do train newbies (and non-newbies). Verbal explanations can also help (and sometimes hinder). I expect that there is plenty of scope for improving newbie learning through constructive feedback - this is something that complements and works alongside the karma system, not something made necessary because the karma system is completely ineffective for the purpose. It is almost always a bad idea to use oneself as an example when making any kind of general criticism of the karma system. Disagreement will inevitably seem personal! I disagreed. This is a testable prediction but not easily so. With a suitably designed experiment I would predict a greater degree of learning in the voted on but not explained group than you would. To be clear I think the power is in the vote buttons AND in clear communication. I am of course willing to use different phrasing. I was intending to convey that it is well within your capability to avoid downvotes if that was a task you set for yourself. It is legitimate to have other higher priorities than avoiding downvotes but those who are not trying to avoid them may appear not to be learning from them. That is, I was questioning that the "Epiphany" anecdote is an indication that newbies do not learn from downvotes because they don't have enough information. I acknowledge from the parent that you are referring to earlier experience prior to you changing the way you interact and so the above is
bogus (1 point, 12y):
If you're willing to make Crocker's rules a codified and accepted norm of the "hard discussion area", you might as well go the whole way and make it very clear to ranters how wrong they are, in the most obnoxious way you can come up with - including flames, status putdowns, etc. Yes it sounds distasteful and it is, but it has some very compelling advantages: (1) it deters other users from naïvely expending effort on unproductive discussions, which is something Eliezer has been complaining about; (2) it will hopefully discourage the vast majority of ranters, thus allowing us to minimize the scope of controversial technical measures such as bans and posting restrictions and restrict them to the most intractable cases. Just "pasting a link that's relevant to their points" is not nearly enough to discourage anyone.

In my experience, being obnoxious doesn't deter others from being obnoxious. Quite the opposite, in fact.

I'm not very worried about an endless September. LW is pretty good at downvoting people when they make rookie mistakes in reasoning and argument, or when they are mean or trollish. The new troll toll (or whatever solution we settle into after a few months) should go even further toward preventing endless September. Moreover, I think the content itself here does a fairly good job of filtering out many kinds of people we don't want.

Finally, I think Xachariah's point is important: "the eternal September effect is primarily caused by new-member with new-member interaction." I would say that LW already does a good job of limiting this. For example, new members who don't understand the culture are downvoted, which means their comments are hidden by default. Also, people are already incentivized to lurk for quite a while before commenting or posting, because the community is clearly intelligent and is constantly using community jargon they could easily be downvoted for misunderstanding.

I also don't think we should make it harder for people to join (e.g. with a quiz). Instead, I think we should make it even easier for the kinds of people we want to find LW and engage. Here are so...

John_Maxwell (5 points, 12y):
What's the case for a video? Seems a little cheesy, IMO. OK, but let's make sure they really do have some domain expertise in the area that they're leaving comments in, so they don't make us look bad. Link. I like this idea. One potential logistical glitch: If the user isn't already familiar with reddit, they won't know what an orange envelope means and they may just see it as orange forever and never click on it.
lukeprog (2 points, 12y):
People like videos and it makes the community more human to newcomers.

People like videos? I hate videos to the point that I will go out of my way to avoid links with videos in them, and I've seen this sentiment expressed by other people here.

Epiphany (7 points, 12y):
I hate video because it goes too slow. I can read at least twice as fast as a video goes. It always feels like such an excruciating waste of time. Also, I can't use find in page. I am addicted to find in page. Ctrl-F and me are attached at the hip. Of all the pages I open, the proportion I read in entirety is very small. Ctrl-F is like half my way of navigating the internet. I'm really glad to see someone else express this. I thought I was the only one.
komponisto (3 points, 12y):
I like videos. They are more passive than written text and feel less cognitively demanding per unit time. In fact, I will often prefer to watch/listen-to a video/audio recording more than once in order to achieve the same level of retention as reading text in a concentrated fashion, thereby exchanging time for concentration-willpower.
Kindly (0 points, 12y):
I suppose I have nothing to complain about as long as the transcript is present and easy to get to.
A1987dM (2 points, 12y):
Some do and some don't.
John_Maxwell (2 points, 12y):
I seem to recall lots of complaints on lukeprog's first Q&A about the fact that the answers were delivered in video format.
curiousepic (0 points, 12y):
FYI he also provided a text transcript.
TraderJoe (0 points, 11y): [comment deleted]
gwern (0 points, 11y):
Transcripts are fairly expensive; patio11 pays for transcripts to be made for his podcasts (a big factor in why those submissions do well on Hacker News), but IIRC the quoted figure is north of $100. So you would pay... but would you pay enough?
TraderJoe (0 points, 11y): [comment deleted]
gwern (0 points, 11y):
College students would be flaky and unreliable, and you'd want at least 2 for error-checking. You get what you pay for.
Curiouskid (0 points, 11y):
Confirmation bias and selection effects?
Karmakaiser (0 points, 12y):
Echoing komponisto, my job is incredibly non-demanding of my cognitive resources, so I constantly listen to audiobooks, youtube channels, and TCC/TMS Lectures at 2x speed. Over the course of an 8 hr work day I can finish about 200 pages at reasonable comprehension.
John_Maxwell (1 point, 12y):
I'd be curious to hear your evidence for this. In any case, even if there is conclusive evidence that internet users prefer video presentations over corresponding text presentations, it's not obvious that this trend extends to LWers or potential LWers. Also, this seems to have been a flop. I suspect that if videos were a good fit for LW concept transmission, we'd have seen more success with that small experimental effort.
lukeprog (2 points, 12y):
Those video experiments were very poorly produced. That's not the kind of video I have in mind. And video would of course only be there in addition to text.
RomanDavis (0 points, 12y):
I would have enjoyed and recommended even poorly produced videos if you guys had bothered to extend them. I keep meaning to finish the last third or so of the sequences I haven't read, but they're all over the place and it makes sense for me to start from the top. It'd be great if I could listen while doing other things. In my case, painting mostly; in other cases, probably cleaning, laundry, dishes, pet care and other activities that take up very low or no verbal mental resources. Guys. It's not rocket science. You're smart. You have good content. Present it well. Or better. If you can't do that, hire someone who can. Get it out there. If you can't do that, hire someone who can.
Epiphany (-2 points, 12y):
You are the decision maker here who will determine whether any changes go into effect. If this is your final decision on the matter of preventing endless September, let me know, and I will place a note at the top of this discussion to prevent people from continuing to waste time on it. If not, then I will focus on debating it with you because I still disagree but it would be a waste of time for me to move forward if you feel that LessWrong needs no more protection against endless September.
lukeprog (5 points, 12y):
No, I want to be debated. I might change my mind.
Epiphany (0 points, 12y):
Okay. I will start a new post specifically as a call to agreement for us to decide whether LessWrong should have better endless September protection. It will take me a little while to get it all organized. Give me a bit of time.

Or, we could debate the subject here.

Epiphany (8 points, 12y):
Well, you were right. Thumb up.
Epiphany (2 points, 12y):
Well, I thought it out carefully and added in citations and whatnot, so now it's kinda too long for a comment. Sorry you did not get the post where you wanted it, but it's done now. Here it is.

Filtering is not the answer.

As noted, the eternal September effect is primarily caused by new-member with new-member interaction. Instead of taking cultural cues from established members, new members take cultural cues from other new members and learn incorrect cultural lessons. Mechanisms to prevent eternal September are to assimilate new members more rapidly and to dissuade new members from posting as much until they have been assimilated (and especially to dissuade them from influencing other new members). Filtering is only useful in that it retards the acquisition of new recruits slowly enough to allow the old recruits to assimilate.

Assuming we're in danger of an eternal September, the correct question to ask is not, "How do we filter better?" but "How do we convince new members to lurk until they're assimilated?"

the eternal September effect is primarily caused by new-member with new-member interaction.

An obvious solution: Make the site appear, to new members, as if only (some desired fraction) of members are new.

Distinguish between "new" and "experienced" members. Let new members turn into experienced members when they meet some criteria, possibly post count, karma, or even votes by experienced members. Systematically prevent new members from interacting with too many other new members by simply not showing them the posts made by these other new members.

I'm not actually sure if I think this is a good idea, but it might be worth mentioning anyway.

drethelin (8 points, 12y):
This seems absurdly hard to implement
matt (1 point, 12y):
It seems not hard to implement naively. Discussion threads would truncate for new users from new user comments (experienced user comments on new user comments would be invisible to new users). Our caching gets more complicated. Many candidate tests for "experienced" seem obvious, but some might be very easy to game (funny comments on HPMOR posts qualify you).
0Caspian12y
If this is done, posts upvoted past a threshold should also be visible to everyone.
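Concretely, the truncation rule plus this upvote-threshold exception might look something like the following sketch (illustrative Python with invented names; a new reader sees experienced authors, their own comments, and anything upvoted past a threshold, while other new users' comments and everything nested under them are hidden):

```python
# Hypothetical sketch of the visibility rule under discussion. The Comment
# structure, the threshold, and the notion of an "experienced" set are all
# assumptions for the sake of the example.
from dataclasses import dataclass, field

@dataclass
class Comment:
    author: str
    score: int
    children: list = field(default_factory=list)

def visible_thread(comments, reader, experienced, upvote_threshold=5):
    """Return the comment tree as the given reader should see it."""
    shown = []
    for c in comments:
        author_ok = c.author in experienced or c.author == reader
        popular = c.score >= upvote_threshold
        if reader in experienced or author_ok or popular:
            shown.append(Comment(c.author, c.score,
                                 visible_thread(c.children, reader,
                                                experienced, upvote_threshold)))
        # Otherwise the comment and its whole subtree are dropped for new
        # readers, matching the "threads truncate at new-user comments" idea.
    return shown
```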
0Epiphany12y
This does nothing to increase the capacity of older members to tolerate newbies - and that's important, too. You'd be giving all the older members ... how many times as many messages? I'm new, and I can't keep up with my messages. I can't imagine what it would do if I was an old member, and all of these new people were responding to me. If I were an old member in that situation, I would try to ignore the new users, and also, I would become increasingly annoyed with them demanding so much of my attention. That would lower the value of using the forum, and it may cause old members to quit. It would also frustrate old members when new members weren't aware of each other's comments. That would be confusing. Do you see a way to resolve these issues?
-2Epiphany12y
There's a limit to how fast this can be done. That's, essentially, why something additional is needed. Deterring them from posting will ward off good people because they'll lose momentum or be annoyed, and will increase the proportion of thick-skinned and/or persistent types who can deal with the annoyance. Not all thick-skinned/persistent people are bad - some are leaders or are gifted with those abilities - but creating gauntlets of annoyance will increase the proportion of undesirable thick-skinned/persistent types like trolls, newbie debaters, etc. Essentially, dissuasion IS filtering, so if you're going to filter, you may as well be conscious of it and use a method that is likely to attract the type that you want. My questionnaire would filter for people who like learning or don't mind looking things up. The karma system currently in place filters for trolls and debaters. Dissuading people from posting will exacerbate the effect of the karma system if it remains as-is; the combination of the two may result in a hideous unintentional filter. If done well, filtering would also encourage a higher proportion of people who are the right type, discouraging mainstream people who aren't genuinely interested in the culture from creating a new majority and taking over. Which is why I suggested the questionnaire that I did: it would select for people genuinely interested in rationality, since most others won't take the time to fill out such a questionnaire. I disagree that that's the correct question, but it might be "How do we convince new members to lurk until they're assimilated without scaring any of them off?"

Public domain and creative commons are not the same thing. In particular, I don't think you can make a share-alike requirement on a public domain item.

8RHollerith12y
You missed a more important and fundamental misconception: namely, the OP is trying to apply copyright-related practices (releasing into the public domain, Creative Commons licenses) to ideas. In other words, he is confusing patents and copyrights.

Furthermore, although it is noble for the OP to try to keep a line of innovation free from patents, the OP's written promise not to apply for a patent on something probably has no legal weight because it was not made in exchange for any kind of consideration. (The requirement that a promise maker obtain some sort of consideration for the promise to be enforceable in a court of law is a basic principle of contract law.) Note that "I hereby place this post in the public domain" and "I hereby give everyone a license to this post under Creative Commons bla bla" are exceptions to the general rule that promises made "without consideration" are not legally enforceable, but, again, releasing into the public domain and Creative Commons licenses have nothing to do with ideas or patents.

The most important thing about patents is that the vast majority of actors who are sued for infringing a patent are selling at least tens of millions of dollars a year in infringing products or services. In other words, the vast expense of patent litigation means that most people using technology to improve the world can safely ignore patents (plans to improve the world that entail someone's selling tens of millions of dollars a year worth of goods and services probably being the biggest exception).

The second most important thing to know about patents, by the way, is that sometimes venture capitalists will refuse to invest in a company either because the company lacks patents or has competitors who have patents, but this is really just a corollary of the first most important thing about patents, since almost all venture-capital investment is made with the hope that the investee will someday sell at least hundreds of millions of dollars a
0Epiphany12y
Thank you. But wait. A copyright and patent are not the same thing. If you release the rights to a patent, might you still retain the copyright, because it is different?
2RHollerith12y
Well, yeah, but if you decide to hoard the copyright on your post (i.e., the post above), that decision would not prevent anyone from creating or selling a product or service that incorporates inventions described in the post. The only thing your copyright on your post can make illegal is the making of copies of the exact same sequence or almost-exact same sequence of words in your post.
1fubarobfusco12y
No. For instance, a movie based on the post might not involve any common sequences of words. Especially if it's silent.
2RHollerith12y
I accept the correction. ADDED. Well, if I wanted to get technical, I would point out that the post is near the lower limit in size of works that can be copyrighted. That is, even in the best of circumstances, it would be difficult to prevail in a copyright infringement suit on the basis of such a small number of words, and the particular part of copyright law that deals with movies based on novels is probably far from the best of circumstances. So, I could make the technical argument that my statement was probably correct because I was referring to one particular rather-short Less-Wrong post, not copyrightable works in general including things like novels. But enough!
4CWG12y
That's correct. "Public domain" is sometimes used in a much vaguer sense to mean the information is out there and being used and shared, but this vaguer sense is best avoided. I suggest to Epiphany to either: * Strike out "public domain" and replace it the idea of being "open licensed", or * As Alicorn suggests, declare it public domain. Creative Commons has tools for this (the advantage being that you give a lot more clarity - so I know that you mean the same thing as I understand by public domain). See the nice summary and: Apply CC0 to your own work. Hope that's helpful.
1Epiphany12y
Well that goes to show how much I know about law. You have successfully detected my "throw everything at it but the kitchen sink" strategy. I have no idea how to fix this. But thank you for trying to help.
8Alicorn12y
Public domain is by far the more permissive option. If you want public domain, just go with that.
2Epiphany12y
I would like to be able to take your advice but I don't know enough about the law to tell who knows enough about the law that I should actually take their advice. This is a riddle.

'Eternal September' situations are caused by new users as a proportion of total users. 10,000 new users in a month could cause an eternal September here, but 10,000 new users would barely be a blip on, say, Pinterest's culture. Growth at a steady percentage of users seems like the safest way to head off an eternal September situation; as the site gains more users it becomes naturally more resilient to culture shift by newbies.

Is LessWrong increasing in growth%, decreasing in growth%, or are growth% numbers roughly staying the same?
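For what it's worth, growth% is straightforward to compute from the same registration-date list used for the bar graph. A rough sketch (assuming one datetime per account; the function and variable names are just for illustration):

```python
# Rough sketch: monthly new users and growth% (new users that month as a
# percentage of the user base at the start of the month).
from collections import Counter
from datetime import datetime

def monthly_growth(registration_dates):
    """Yield (month, new_users, growth_pct) for each month in the data."""
    per_month = Counter(d.strftime("%Y-%m") for d in registration_dates)
    total = 0
    for month in sorted(per_month):          # "YYYY-MM" sorts chronologically
        new = per_month[month]
        pct = (new / total * 100) if total else float("inf")
        yield month, new, pct
        total += new
```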

-2Epiphany12y
I agree that it is safer to keep growth at a manageable pace than it is to try to grow faster while also trying to prevent endless September. However, I would disagree with the idea that managing the pace of growth will prevent endless September, so relying on pacing to solve the problem seems like a bad idea to me. My perspective is that groups trend toward normal over time as they grow larger - regardless of the pace. I think this is most noticeable during a deluge, that a large deluge will definitely speed up the process, and that growth curves are such that this can seem to happen overnight. It may be a case of mistaking correlation for causation to assume that a culture is watered down simply because there are too many people; I think it's because the group eventually attracts a snowball of people who are ever less like the original group. People who start a new group have some kind of difference - why would you break away from the herd if this were not so? Once you've started the group, it is less likely that you'll find people who are very much like you than people who are just plain similar. So as the group grows, the people attracted to it are similar to the ones already there, but not really that similar. Of course, as the circle of people enlarges, what is defined as "similar" becomes more relaxed over time. Different people are necessarily overwhelmed by mainstream people eventually, due to the fact that there are more mainstream people than different people. That's the way I've observed it happening. If you know an example of a culture that actually stayed the same with a very large number of users, I'd be interested in hearing about that. You mentioned Pinterest, but do we know whether their beginning culture was retained? It sounds like you are just saying the current culture would be retained. For all I know, they've already been through an endless September and the culture they have now is very different from the one they began with. I'm glad you brought this up, worded in that way, b

Summary of Solution Ideas:

(In alphabetical order.)

A ban button for older active users that takes effect if pressed by enough of them within a certain time period (a sketch follows the list below).

Pros:

  • Increases the capacity of the site to deal with an influx of trolls.
  • Frees up vote buttons to do what they're intended to do. For instance: people probably don't downvote nearly as much as would reflect their opinion, since it triggers a troll tax and hides the comment.
  • Prevents "feeding trolls" by giving the trolls negative attention.
  • No need to rely on moderators to be good at being s
... (read more)
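To make the threshold mechanic concrete, here is a minimal sketch (illustrative Python; the eligibility rule, quorum, and time window are assumptions, not settled proposals):

```python
# Hypothetical sketch of the "ban button": a ban takes effect only if enough
# established users press it within a rolling time window.
from datetime import datetime, timedelta

BAN_WINDOW = timedelta(hours=24)
BAN_QUORUM = 5
MIN_VOTER_KARMA = 500                  # who counts as an "older active user"
MIN_VOTER_AGE = timedelta(days=180)

ban_votes = {}                         # target username -> list of (voter, timestamp)

def press_ban(target, voter_name, voter_karma, voter_joined, now=None):
    """Record a ban press; return True once the quorum is reached in the window."""
    now = now or datetime.utcnow()
    if voter_karma < MIN_VOTER_KARMA or now - voter_joined < MIN_VOTER_AGE:
        return False                   # voter not eligible, press is ignored
    # Keep only presses that are still inside the window, one per voter.
    votes = [(v, t) for v, t in ban_votes.get(target, [])
             if now - t <= BAN_WINDOW and v != voter_name]
    votes.append((voter_name, now))
    ban_votes[target] = votes
    return len(votes) >= BAN_QUORUM
```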
1cata11y
This is pretty thought-provoking; thanks for laying it all out. I think each of the devils is in its respective details. People have very different intuitions about, for example, how many people will be turned off by a quiz requirement, or how many useful contributions would be cut off by a karma restriction on comment quantity, and it's hard to make progress toward quantifying that without running experiments, which may be temporarily harmful, have confounding factors, and take a lot of manpower. In the end, we usually settle on "loudest intuition wins", but it would be nice to make some progress on that.
0beoShaffer11y
I'm not sure how technically feasible it is, but I'd be interested in having something like the WikiWords system from MediaWiki (the base for TV Tropes) for internal links and/or links to the wiki. I already try to link to them whenever relevant, but it's a non-negligible inconvenience to find the right urls and add the right markup. Perhaps (down)voting could automatically open a reply box, thus encouraging more detailed feedback while still allowing user discretion. More feedback is usually good, but sometimes someone has already written a good critique that I can just upvote or something, so I don't like making it mandatory. -edited to clarify that I meant MediaWiki rather than the TV Tropes specific variant.
3Epiphany11y
Mm, good idea - I don't know why I overlooked that (making it prompt the user when voting rather than requiring it). I will change the idea.
3Nornagest11y
TV Tropes' markup system is a godawful homegrown mess and I wouldn't recommend using it; it'd be incompatible with wiki markup and unfamiliar to pretty much everyone that hasn't done time on TV Tropes. Incorporating some subset of MediaWiki markup into the blog wouldn't be a bad idea, though.
0beoShaffer11y
Sorry, I was thinking of MediaWiki, but I put TV Tropes because I had just finished explaining the parts of MediaWiki I like in the context of explaining TV Tropes, and I don't use any other MediaWiki sites, so TV Tropes was much more mentally salient than MediaWiki.
0Nornagest11y
TV Tropes is based on pmwiki, actually, although it's got a great deal of homebrew code on top of that (including much of its markup). MediaWiki's what Wikipedia uses, along with the Less Wrong wiki and many other post-Wikipedia wikis. The two are both written in PHP and accept SQL backends, but they don't have much in common in terms of interface, and there are pretty substantial differences in markup as well. I haven't spent a lot of time in MediaWiki, but for example it doesn't do pmwiki-style WikiWords; internal links are established via [[double square brackets]] instead.
0beoShaffer11y
Ok, so what I'm trying to say is that I want WikiWords, approximately like what's offered on TV Tropes, and I was going along with what I thought you were saying because I don't do any other wikis or know much about TV Tropes' codebase.

(This used to be the draft of my endless September poll)

[-][anonymous]12y20

First: A LessWrong seed bank. If this forest grows diseased or burns to the ground, the means to replant. Already in the LessWrong seed bank: The Sequences, FAQ, User Guide and MediaWiki.

Second: Terms of surrender. When conditions X, Y and Z are met, LessWrong will fold or reboot.

2Rhwawn12y
That's an excellent idea, but I can't think of any clear metric of success or failure, short of really unlikely ones like "during the annual poll, a majority of LWers vote for astrology".
-2Epiphany12y
This is for ideas to prevent disaster, not solve it after the fact. Also, if the suggestion is "Leave the wiki and sequences up", you're essentially saying "Do nothing". This just doesn't read like a plan.

The right bar that goes off the page is so far unexplained for me - 921 users joined in September 2011, more than three times the number in the months before and after it. If you happen to know what caused that, I would be very interested in finding out.

My prediction is something HPMOR-related: either more links to LessWrong in the Author's Notes, or HPMOR itself had a spike that month.

Another way to cut down on new-new interaction is to limit the number of comments someone can make in a time period: if people can only comment once a day until their k... (read more)
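As a rough illustration of the karma-gated rate limit being described, here is a minimal sketch (the thresholds and field names are invented for the example, not a concrete proposal):

```python
# Hypothetical sketch: new accounts get one comment per day until their karma
# crosses a threshold; established accounts get an effectively generous limit.
from datetime import datetime, timedelta

KARMA_THRESHOLD = 20
NEW_USER_LIMIT = 1          # comments per day below the threshold
ESTABLISHED_LIMIT = 50      # effectively unlimited above it

def may_comment(karma, recent_comment_times, now=None):
    """Return True if the user is allowed to post another comment right now."""
    now = now or datetime.utcnow()
    in_last_day = [t for t in recent_comment_times if now - t < timedelta(days=1)]
    limit = NEW_USER_LIMIT if karma < KARMA_THRESHOLD else ESTABLISHED_LIMIT
    return len(in_last_day) < limit
```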

0Epiphany12y
Problem: Limiting the number of posts doesn't limit the number of comments, so they'd still be able to overwhelm older users with newbie comments or create their own culture in the comments. I think this idea would be ineffective unless, by "posts" you meant "comments" (or added some similar plan for comments).
2Vaniver11y
I meant "comments" by "posts." I'll edit the grandparent to be clearer.
0Epiphany11y
Thanks, Vaniver. The OP has been updated. I also used find to see whether there were other ambiguities around the word "post" in the OP. Caught a few. (:
-2Epiphany12y
Limiting the posts would cause new users to lose momentum. A lot of them might lose steam after joining and give up. That would be risky. Also, because a large proportion would give up, this would filter users. We'd end up with a larger proportion of the type of user persistent enough to tolerate this. I don't know what that sort of person looks like. I'll add the idea to the pile, but I can't really sell it if those things aren't addressed.

Your description of the problem seems spot on to me, and most of your proposed solution sounds sensible as well. Using a questionnaire to let users graduate into the advanced section seems a little exploitable, though.

One alternative could be requiring a new user to select one of their recent comments for "admission review". If the "reviewers" agree that the comment is unusually good, they let the user in, otherwise they give some guidance on what can be improved, and let the user try again in a week or so. That may also have the side effect of improving the quality of discussion in the easy section, as users try to write comments that are good enough for "admission review".
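To make that flow concrete, a minimal sketch of the review loop might look like this (illustrative Python; the reviewer count and retry delay are assumptions):

```python
# Hypothetical sketch of "admission review": a new user nominates a recent
# comment; experienced reviewers vote; unanimous approval admits the user,
# otherwise they get the reviewers' feedback and a retry date about a week out.
from datetime import datetime, timedelta

RETRY_DELAY = timedelta(days=7)
REVIEWERS_NEEDED = 3

def review_admission(votes, now=None):
    """votes: list of (approved: bool, feedback: str) from experienced reviewers.
    Returns (status, feedback, next_attempt)."""
    now = now or datetime.utcnow()
    if len(votes) < REVIEWERS_NEEDED:
        return ("pending", None, None)
    if all(approved for approved, _ in votes):
        return ("admitted", None, None)
    feedback = [msg for approved, msg in votes if not approved and msg]
    return ("retry", feedback, now + RETRY_DELAY)
```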

-2Epiphany12y
Why is my questionnaire exploitable? The problem with having old users review new users' comments is: 1. We haven't verified that they'd be willing to do this. 2. If a lot of new users come in all at once, that would be a chore. This might actually scare off old users, or create a backlog of comments to go through that prevents new users from participating. There's a high risk of this going totally dysfunctional. Unless you see something about it that I don't?
[-][anonymous]12y00

As someone who still considers themselves a "newbie", I actually have a few thoughts on this. (I know my account is actually quite old - I discovered this site and spent a day or two on it when I was avoiding studying for a test in college, but then I forgot about it and didn't happen to rediscover the site for well over a year).

I don't really have a feel for how often you guys get new users stumbling in here and posting, but I have to say that at least for me, wandering into the discussion section in the beginning sent me running right back ... (read more)

[This comment is no longer endorsed by its author]

To me it seems like we already have easy and hard sections: Discussion and Main. I'm fairly new, so that may not be how others treat them. I do think it would be beneficial to dedicate some discussion areas to specialty topics. Some of the recent articles (the theoretical UDT ones, for instance) would benefit from being grouped in a separate section to preserve context, since related articles may be few and far between. I imagine the meetup notices would benefit as well.

If LW does implement a newbie discussion section I think it will be important to ensur... (read more)

-2Epiphany12y
As I see it, Main is not a discussion area; it's a blog. If you post in Main, you're publishing an article. Discussions are discussions - they have standards there, but I don't think the idea is to post articles. I'm also new. I don't. That would completely defeat the purpose. The whole idea of a newbie forum is to sandbox the endless September.
0Pentashagon12y
So, in essence, you don't really want new users unless they didn't need the newbie forum in the first place? Maybe I'm misunderstanding you. Do you think it's beneficial to host an eternal September solely to keep it from leaking into the "important" parts of LW?
-2Epiphany12y
No, it's because anybody who comes that's willing to learn should have a chance to learn. If we prevent them from joining the regular discussions, they don't get a chance to learn. If they do join, everyone loses their chance to learn - the old members will leave due to tedium (because that's where THEY go to learn and they need to be around people who can give them a challenge in order for that to happen) and so there will be no one around to explain everything. Essentially the newbies will be in a forum of their own in any case. If they're in a forum of their own HERE though, then we can at least figure out a way to explain things to that many people at once, or people will explain as they have time for it, which is slower, but it's better than the old members leaving the site.
0Pentashagon12y
Maybe I'm too optimistic in thinking that most users could eventually migrate into the normal discussion areas. If so, then you're probably right that just containing the eternal September is the best solution.
0Epiphany12y
Well, eternal September, by definition, means many won't completely acculturate. If they all acculturate and move up, then the beginner area will be a temporary place for newbies to learn, and there was no eternal September. If there is an eternal September, then the beginner area's purpose would be to contain it. Those who acculturate would move on, but not everyone would acculturate.

921 users joined in September 2011, more than three times the number in the months before and after it. If you happen to know what caused that, I would be very interested in finding out.

Speculative: the Singularity Summit Australia 2011 was held in late August that year, during National Science Week. Then again, it could be a case of post hoc, ergo propter hoc. There could be no cause other than variance chiming in, which is to be expected from time to time.

1Rhwawn12y
What about Methods of Rationality? September 2011 is mid-way through its upswing. I see no easy way to quantify reviews, though, short of manually going through the thousands on FF.net...
0gwern11y
Actually, you might find my http://www.gwern.net/hpmor#analysis useful! Looking at all reviews posted per day, in September 2011, there does in fact seem to be a large spike in number of posted reviews.
-1Epiphany12y
Thank you Kawoomba, it sounds like an interesting theory. True that correlation is not causation, but maybe if we map other events to the numbers, we will see a pattern. (:

I definitely agree with you that we should avoid IQ testing.

I generally dislike the names "Hard" and "Easy", but I don't have anything better at the moment. Maybe "Beginner" and ... "Intermediate"? "Experienced"? I'm not sure.

Also, a minor point. About this:

Person A in a debate explains a belief but it's not well-supported. Their opponent, person B, says they're an idiot. What is this an example of?

A. Attacking the person, a great way to really nail your opponent in an argument.

B. Attacking the person,

... (read more)
2[anonymous]12y
I think it would prime people to treat this as something game-like.
-2Epiphany12y
Okay, I will reword the question - thanks for pointing that out. I don't know what names would be best. I wonder whether the word "Advanced" will put people off and sound elitist. Using the words "Easy" and "Hard" implies that the people in the "Hard" area are choosing to challenge themselves and are putting in more effort to do so. "Beginner" implies that after a while you're supposed to go to the other forum, when really, everyone learns at their own pace and some people may just prefer for it to be easy. Maybe you don't like the words because they're a little bit cute, as if it were a game? Maybe fancier words like "Challenging" or "Difficult" would appeal more?