Less Wrong doesn't seem "overgrown" to me. It actually seems dried out and dying because the culture is so negative people don't want to post here. I believe Eliezer has talked about how whenever he posted something on LW, the comments would be full of people trying to find anything wrong with it.
Here's an example of what I think makes LessWrong unappealing. User Clarity wrote an interesting discussion level post about his mistakes as an investor/gambler and it was downvoted to oblivion. Shouldn't people be encouraged to discuss their failures as they relate to rationality? Do we really want to discourage this? No one even bothered to explain why they downvoted.
All discussion in Less Wrong 2.0 is seen explicitly as an attempt to exchange information for the purpose of reaching Aumann agreement. In order to facilitate this goal, communication must be precise. Therefore, all users agree to abide by Crocker's Rules for all communication that takes place on the website.
I think trying to impose strict new censorship rules and social control over communication is more likely to deal the death blow to this website than to help it. LessWrong really needs an injection of positive energy and purpose. In the absence of this, I expect LW to continue to decline.
Less Wrong doesn't seem "overgrown" to me. It actually seems dried out and dying because the culture is so negative people don't want to post here. I believe Eliezer has talked about how whenever he posted something on LW, the comments would be full of people trying to find anything wrong with it.
"Overgrown" was probably a bad analogy, I tried too hard to reference the idea of well-kept gardens. What I was trying to say is that there are too many hostile elements who are making this website an unwelcoming place, by unnecessary criticism, ad hominem attacks and downvotes; and that those elements should have been removed from the community earlier. I actually think we agree on this.
I think trying to impose strict new censorship rules and social control over communication is more likely to deal the death blow to this website than to help it. LessWrong really needs an injection of positive energy and purpose. In the absence of this, I expect LW to continue to decline.
OK, from reading this and other comments I accept that this was the weakest part of my post. Also, after re-reading the Wiki entry on Crocker's rule, I don't think I intended to suggest anything quite that extreme. Crocker's rules say that rudeness is acceptable simply in order to provide a precise and accurate signal of annoyance. This is certainly not what I had in mind.
I apologize for my incorrect usage of the term "Crocker's rules", and I recognize that this was probably not a good idea. I hope someone can come up with a policy that achieves the objectives I had in mind when I wrote that sentence.
I agree with this (I probably contribute a bit to the problem, I will try to do better).
edit: I think a lot of Clarity downvotes have to do w/ people not liking how that person comes across.
doesn't seem "overgrown" to me. It actually seems dried out and dying
I think both. There's more post and comment volume than I remember from "the good old days", but it's much lower quality in terms of density of interesting contributions.
because the culture is so negative people don't want to post here.
I don't agree with that as the main cause. I don't think it's a negative culture, so much as a lack of positive culture. The people doing interesting work on rationality, decisionmaking, and AI are mostly not doing it here anymore.
I could get behind most of the ideas discussed here, but I'm wary of the entire "Standards of Discourse and Policy on Mindkillers" section. It's refreshing to have a section of the internet not concerned with politics. Besides, I don't think the world is anywhere near Pareto-optimal, so I don't think political discussions are even useful: acquiring better political views incurs opportunity costs. Why fight the other side to gain an inch of ground when we could do something less controversial but highly effective at improving things? I'm all for discussing weird, counterintuitive, and neglected topics, but politics is only interesting for the same reason soap operas and dramas are interesting. The most viral arguments aren't necessarily the most worthwhile.
As for mandatory Crocker's rules, the wiki article has this to say:
Crocker emphasized, repeatedly, in Wikipedia discourse and elsewhere, that one could only adopt Crocker's rules to apply to oneself, and could not impose them on a debate or forum with participants who had not opted-in explicitly to these rules, nor use them to exclude any participant.
I suspect that if Crocker's rules were mandatory for participation in something, there would be a large number of people who would be pushed into accepting them. I don't think this would actually improve anything. Lots of people invoking Crocker's rules is a symptom of a healthy discourse, not a cause of it. Personally, I know that when I invoke Crocker's rules I have a much smaller knee-jerk reaction to criticism. LessWrong can already be a blunt place at times, probably more than is optimal.
I probably have 50 pages of random notes and thoughts that would be appropriate to post here, but haven't. Some are things I started to write specifically for LW but never finished polishing. I suspect both the community and I would benefit from the discussion, but honestly it takes an order of magnitude more time for me to get something into a state where I would be willing to post it here, and probably twice as much time as it takes to get something to where I would post it on Facebook. I get diminishing returns from rereading the same thing over and over again, and it's much harder to reach activation energy here. I suspect that difference is mostly due to the subjective feel of the group.
I posted some thoughts about this on Facebook, and there's some discussion here:
My most important thought is:
We should not be thinking in terms of "how can the current LW community bootstrap itself into creating a platform that'll enable good content." We should be thinking primarily in terms of "can we build something that Eliezer, Scott, etc would be actively excited to use." (Where "Eliezer/Scott" is a stand-in for "anyone who's currently creating good content in a non-centralized place.")
I like some of the ideas in this post, but I think they're emphasized a little weirdly. I think the biggest obstacle we're facing is that the Popular Bloggers like to be able to talk about whatever they want, and they want control over their comments.
A lot of this post (and other attempts) seems to be thinking in terms of "how to restore LW to a sort of Platonic version of itself, where an upvote/downvote system and policy attracts and outputs quality content", and I think that approach won't actually solve the core problem.
Thanks for thinking this through.
A few questions:
Would there be a way for people who already maintain blogs elsewhere to cross-post to their LW subdomain? (Would this even be desirable?)
Do you envision LW2 continuing to include applied rationality type posts? Does that work with "everything should work towards Aumann agreement"?
users may not repeatedly bring up the same controversial discussion outside of their original context
How could we track this, other than relying on mods to be like "ugh, this poster again"?
professionally edited rationality journal
Woah. Is this really a thing that MIRI could (resources permitting) just like ... do?
Would there be a way for people who already maintain blogs elsewhere to cross-post to their LW subdomain? (Would this even be desirable?)
We would have to discuss this with people who run blogs elsewhere to find out what solutions would work for them. My preferred solution would be for people to import their old blog posts and then redirect their domain to the LW subdomain. I do not know whether outside bloggers would find this acceptable. In some cases we may also have to consider the question of advertising revenue.
Do you envision LW2 continuing to include applied rationality type posts? Does that work with "everything should work towards Aumann agreement"?
My apologies, I did not intend to declare that posts about applied rationality should be avoided. I guess my phrasing reveals my bias towards the "epistemic" part of this community rather than the "instrumental" side. My personal preference is to shift the community focus back towards epistemic rationality, but that is a separate discussion which I did not intend to raise here, and which the community should have on its own.
users may not repeatedly bring up the same controversial discussion outside of their original context
How could we track this, other than relying on mods to be like "ugh, this poster again"?
There would have to be some moderator discretion on this issue. My personal view is that we should err on the side of allowing most content. This language was intended for extreme cases where the community consensus is clear that a line has been crossed, such as Eugine, AdvancedAtheist or Jim Donald.
professionally edited rationality journal
Woah. Is this really a thing that MIRI could (resources permitting) just like ... do?
Yes, they can certainly do this if they have the resources. Initially, academics may not take the journal seriously and it definitely will not be indexed in academic databases. If the quality is sufficiently high, it is conceivable that this may change.
I am a huge fan of the journal idea. That would incentivise posting here instead of anywhere else and give users a better chance of being noticed if they are willing to put in the effort.
I suggest that any downvote must be explained. That will help a person understand what is wrong and improve.
However, if for some reason MIRI is unwilling to do this, and if there is sufficient interest in going in this direction, I offer to pay server costs. If necessary, I also offer to pay some limited amount for someone to develop the codebase (based on Open Source solutions).
It sounds like a situation where Kickstarter or something like that could be useful.
I sometimes semi-jokingly say that the easiest way for me to contribute to LW code would be to reimplement the whole thing in Java.
If someone would be okay with paying me to do this, I would be happy to try. Not sure if that's the best use of the money, though. Making it open-source would have the advantage that if I fail to complete the project, other people may join or replace me later.
(Why Java? First, it is the only programming language I am sufficiently fluent in. Second, Java has the advantage that the code is already compiled and loaded in memory when HTTP requests arrive, so in theory it should be fast. But I don't have experience measuring performance in large projects.)
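To make the point about the JVM serving requests from already-compiled code concrete, here is a minimal sketch using only the JDK's built-in com.sun.net.httpserver package. The class name MiniForum and the /posts route are hypothetical illustrations, not part of any actual LW codebase; a real reimplementation would obviously involve persistence, templating, and far more.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class MiniForum {
    public static void main(String[] args) throws Exception {
        // Bind to an ephemeral port (0) so the demo never collides with another service.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);

        // The handler is plain compiled Java held in memory; each request
        // runs it directly, with no interpretation or template compilation step.
        server.createContext("/posts", exchange -> {
            byte[] body = "hello from the JVM".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();

        // Exercise the endpoint once from the same process.
        int port = server.getAddress().getPort();
        URL url = new URL("http://127.0.0.1:" + port + "/posts");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            System.out.println(in.readLine());
        }
        server.stop(0);
    }
}
```

This is only a sketch of the performance argument, not an endorsement of the built-in server for production; an open-source framework would be the more realistic starting point.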
(tl;dr: In this post, I make some concrete suggestions for LessWrong 2.0.)
Less Wrong 2.0
A few months ago, Vaniver posted some ideas about how to reinvigorate Less Wrong. Based on comments in that thread and on personal discussions I have had with other members of the community, I believe there are several different views on why Less Wrong is dying. The following are among the most popular hypotheses:
(1) Pacifism has caused our previously well-kept garden to become overgrown.
(2) The aversion to politics has caused a lot of interesting political discussions to move away from the website.
(3) People prefer posting to their personal blogs.
With this background, I suggest the following policies for Less Wrong 2.0. This should be seen only as a starting point for discussion about the ideal way to implement a rationality forum. Most likely, some of my ideas are counterproductive. If anyone has better suggestions, please post them to the comments.
Moderation Policy:
There are four levels of users:
Personal Blogs:
Other Ideas: