Well, this is discouraging to someone who had the opposite reaction to ingres' recent survey analysis. I heard, "Try to solve the object-level problem and create content that meets the desiderata that are implicit in the survey results."
I was going to start writing about feelings-as-information theory; Kaj Sotala introduced moods as information in Avoid misinterpreting your emotions, lukeprog mentions it briefly in When Intuitions Are Useful (which Wei Dai thought might be relevant to metaphilosophy), and gwern mentions related work on processing fluency here. There are simple but interesting twists on classic, already-simple heuristics and biases experiments that everyone here's familiar with, debiasing implications, stuff about aesthetics, stuff on how we switch between Type 1 and Type 2 processing, which is relevant to the stuff lukeprog was getting into with Project guide: How IQ predicts metacognition and philosophical success, and what Kaj Sotala was getting into with his summaries of Stanovich's What Intelligence Tests Miss.
I was just about to write another post about how thinking of too many alternative outcomes to historical events can actually make hindsight bias worse, with explanations of the experimental evidence, like my most recent post. I don't know how to do more for the audience than do things like warn them about how debiasing hindsight can backfire.
And there's other stuff I could think to write about after all of that.
There are quite a number of people coordinating to fulfill the goal of revitalizing LW, and I wonder if something like this couldn't have waited. I mean, everyone just told everyone exactly what everyone's doing wrong.
I'm sorry for discouraging you. I think writing the posts you described is a great idea. I hope that if you write them, people who've read this will be more inclined to upvote them if they like them, given increased awareness of the incentives problem I described.
Another option is to pursue multiple angles of attack in parallel. My angle requires a programmer or two to volunteer their time (may as well contact Scott now if you're interested!); your angle requires people who have ideas to write them up. My guess is that these requirements don't funge against each other very much. Plus, even if the community ultimately decides to go elsewhere, I'm sure your ideas will be welcomed in that new place if you just post whatever you were going to post to LW there, and that will be a valuable kickstart.
I also agree that having people repeatedly say "LW is dying" can easily become a self-fulfilling prophecy. Even if LW is no longer a check-once-a-day kind of place, it can still be a perfectly fine check-once-a-week kind of place. I probably should have been more careful in my phrasing.
(I've mostly only skimmed.)
It can be hard to find good content in the diaspora. Possible solution: Weekly "diaspora roundup" posts to Less Wrong. I'm too busy to do this, but anyone else is more than welcome to (assuming both people reading LW and people in the diaspora want it).
This is what /r/RationalistDiaspora was intended to do. It never really got traction, and is basically dead now, but it still strikes me as a good solution. If that's not going to revive, though, I agree that a weekly thread on LW is worth trying. By default, I'll make one later this week. (I'm not currently sure I'll have anything to post in it myself; I'll be asking people to post links in the comments.)
Go tell Scott Alexander you'll build an online forum to his specification, with SSC community feedback, to provide a better solution for his overflowing open threads.
He tried to move people to /r/SlateStarCodex, but that didn't work. We'd want to understand why. (Some hypotheses: it wasn't actually on SSC, where people go directly; posts there don't pop up in their RSS readers; people have an aversion to comment systems with voting; people have an aversion to reddit specifically.)
As Scott features more and more posts, he gains a moderation team full of people who wrote posts that were good enough to feature.
I'm not sure that "writes good posts" and "would make a good moderator" are sufficiently correlated for this to work. A lot of people like Eliezer's writing but dislike his approach to moderation.
(On the other hand: maybe, if we want Eliezers to stick around, we need them to be able to shape the community? Even if that means upsetting people who don't write much.)
It also creates weird incentives, like: "I liked this post that was highly critical of our community, but I don't want the author to be a mod". (This is the problem that Scott Aaronson points to of "this system can only improve on ordinary democracy if the trust network has some other purpose" - I worry that voting-for-comment-scores isn't a sufficiently strong purpose to outweigh voting-for-moderators.)
Another system to consider would be to do it based on the way people administer votes, not the way they remove them. If your votes tend to correlate with others', they have more weight in future. If posts you flag tend to get removed, your flags count for more. (I'm not convinced that this works either.)
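A minimal sketch of what that weighting rule could look like in code. The update rule, step size, and floor value here are illustrative choices I'm making up for the sketch, not a worked-out design:

```python
from collections import defaultdict

class VoteWeights:
    """Sketch of weighting votes by past agreement with the community:
    each time a user's vote matches the eventual consensus on a post
    (kept vs. removed), their future votes count a little more; each
    mismatch counts against them. Step size and floor are illustrative.
    """
    def __init__(self, step=0.1, floor=0.1):
        self.step = step
        self.floor = floor
        self.weight = defaultdict(lambda: 1.0)  # everyone starts at weight 1

    def record_outcome(self, voter, vote, consensus):
        # vote and consensus are +1 (upvote / post kept)
        # or -1 (flag-or-downvote / post removed)
        if vote == consensus:
            self.weight[voter] += self.step
        else:
            self.weight[voter] = max(self.floor, self.weight[voter] - self.step)

weights = VoteWeights()
weights.record_outcome("carol", vote=+1, consensus=+1)  # agreed with outcome
weights.record_outcome("dave", vote=+1, consensus=-1)   # disagreed with outcome
```

The same ledger covers the flag case: if posts you flag tend to get removed, your recorded outcomes agree with consensus and your weight drifts upward.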
He tried to move people to /r/SlateStarCodex, but that didn't work.
He didn't really try. All he did was mention offhand a couple of times that if people are unhappy with how the comment section works, there is the subreddit and it looks reasonable to him.
It would not be hard for Scott to move people to the subreddit: put a link to it at the end of each article, and go there himself to respond to comments.
He tried to move people to /r/SlateStarCodex, but that didn't work. We'd want to understand why. (Some hypotheses: it wasn't actually on SSC, where people go directly; posts there don't pop up in their RSS readers; people have an aversion to comment systems with voting; people have an aversion to reddit specifically.)
I think a big explanation is that /r/SlateStarCodex was not advertised sufficiently, and people never developed the habit of visiting there. I imagine that if Scott chose to highlight great comments or self posts from /r/SlateStarCodex each week, the subreddit would grow faster, for instance.
Online communities are Schelling points. People want to be readers in the community where all the writers are, and vice versa. Force of habit keeps people visiting the same places over and over again, but if they don't feel reinforced through interesting content to read / recognition of their writing, they're liable to go elsewhere. The most likely explanation for why any online community fails, including stuff like /r/RationalistDiaspora and /r/SlateStarCodex, is that it never becomes a Schelling point. My explanation for why LW has lost traffic: there was a feedback loop involving people not being reinforced for writing and LW gradually losing its strength as a Schelling point.
Edit: also, subreddits are better suited to link sharing than original posts IMO.
I'm not sure that "writes good posts" and "would make a good moderator" are sufficiently correlated for this to work. A lot of people like Eliezer's writing but dislike his approach to moderation.
Acknowledged, but as long as the correlation is above 0, I suspect it's a better system than what reddit has, where ability to vote is based on possession of a warm body.
It also creates weird incentives, like: "I liked this post that was highly critical of our community, but I don't want the author to be a mod".
Concrete example: Holden Karnofsky's critical post was liked by many people. Holden has posted other stuff too, and his karma is 3689. That would give him about 1% of Eliezer's influence, 4% of Yvain's influence, or 39% of my influence. This doesn't sound upsetting to me and I doubt it would upset many others. If Holden was able to, say, collect mucho karma by writing highly upvoted rebuttals of every individual sequence post, then maybe he should be the new LW moderator-in-chief.
But even if you're sure this is a problem, it'd be simple to add another upvote option that increases visibility without bestowing karma. I deliberately kept my proposal simple because I didn't want to take away the fun of hashing out details from other people :) I'm in favor of giving Scott Alexander "god status" (ability to edit the karma for every person and post) until all the incentive details are worked out, and maybe even after that. In the extreme, the system I describe is simply a tool to lighten Scott's moderation load.
(This is the problem that Scott Aaronson points to of "this system can only improve on ordinary democracy if the trust network has some other purpose" - I worry that voting-for-comment-scores isn't a sufficiently strong purpose to outweigh voting-for-moderators.)
So I guess the analogy here would be if I want a particular user to have more influence, I'd vote up a post of theirs that I didn't think was very good in order to give them that influence? I guess this is a problem that would need to be dealt with. Some quick thoughts on solutions: Anonymize posts before they're voted on. Give Scott the ability to "punish" everyone who voted up a particularly bad post and lessen their moderation abilities.
Another system to consider would be to do it based on the way people administer votes, not the way they remove them. If your votes tend to correlate with others', they have more weight in future. If posts you flag tend to get removed, your flags count for more. (I'm not convinced that this works either.)
A related idea that might work better: Make it so downvotes work to decrease the karma score of everyone who upvoted a particular thing. This incentivizes upvoting things that people won't find upsetting, which works against the sort of controversy the rest of the internet incentivizes. But there's no Keynesian beauty contest because you can never gain points through upvoting, only lose them. This also creates the possibility that there will be a cost associated with upvoting a thing, which makes karma a bit more like currency (not necessarily a bad thing).
The Less Wrong diaspora demonstrates that the toughest competition for online forums may be individual personal blogs. By writing on your personal blog, you build up your own status & online presence. To be more competitive with personal blogs, it might make sense to give high-karma users of a hypothetical SSC forum the ability to upvote their own posts multiple times, in addition to those of others. That way if I have a solid history of making quality contributions, I'd also have the ability to upvote a new post of mine multiple times if it was an idea I really wanted to see get out there, in the same way a person with a widely read personal blog has the ability to really get an idea out there. The mechanism I outlined above (downvotes taking away karma from the people who upvoted a thing) could prevent abuse of self-upvoting: if I self-upvote my own post massively, but it turns out to be lousy, other people will downvote it, and I'll lose some of the karma that gave me the ability to self-upvote massively.
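To make the clawback rule concrete, here's a toy sketch. The penalty size and all names are illustrative assumptions; the point is just that upvoting never earns the voter anything and can only cost them later:

```python
from collections import defaultdict

class Ledger:
    """Sketch of the clawback rule: a downvote on a post subtracts karma
    from everyone who upvoted it. Since upvoting can never gain the voter
    points, there's no Keynesian beauty contest; endorsing a post is a
    small gamble that it won't attract downvotes. Penalty is illustrative.
    """
    def __init__(self, penalty=1):
        self.penalty = penalty
        self.karma = defaultdict(int)
        self.upvoters = defaultdict(list)

    def upvote(self, voter, post):
        # Recording the upvote is free now, but exposes the voter
        # to future downvotes on this post.
        self.upvoters[post].append(voter)

    def downvote(self, post):
        # Each downvote charges every prior upvoter of the post.
        for voter in self.upvoters[post]:
            self.karma[voter] -= self.penalty

ledger = Ledger()
ledger.karma["bob"] = 10
ledger.upvote("bob", "clickbait-post")
ledger.downvote("clickbait-post")  # bob pays for endorsing it
```

Self-upvotes would go through the same ledger, so a massively self-upvoted lousy post drains its author's own karma as the downvotes come in.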
If posts you flag tend to get removed, your flags count for more. (I'm not convinced that this works either.)
StackExchange uses a flag weight model. They removed it from the visible section of the profile (http://meta.stackexchange.com/questions/119715/what-happened-to-flag-weight) but I think they still use it internally.
I'm writing this from Less Wrong 2.0.
If you've got a great idea for a blog post, and you don't already have an online presence, it's a bit hard to reach lots of people, if that's what you want to do.
I don't know what Less Wrong 1.0 was like but I feel like Less Wrong 2.0 accomplishes this.
If we had a good system for incentivizing people to write great stuff (as opposed to merely tolerating great stuff the way LW culture historically has), we'd get more great stuff written.
Once again, I don't know what Less Wrong 1.0 was like, but I think Less Wrong 2.0 does a good job of this without incentivizing too much.
Excellent post. Agree with all major points.
I think Less Wrong experienced the reverse of the evaporative cooling EY feared, where people gradually left the arena as the proportional number of critics in the stands grew ever larger.
I'd think it was primarily not the proportional number of critics, but the lower quality of criticism, and great users getting tired of replying to or downvoting it. Most of the old crowd of LessWrongers welcomed well-thought-out criticism, but when people on the other side of an inferential-distance gap try to imitate those high-criticism norms, it's annoying to deal with, so the good writers end up leaving. Especially if the lower-quality users are loud and more willing to use downvotes as punishment for things they don't understand.
In terms of low-quality criticism, a lot of people keep trying to fight the hypothetical. That gets tiring rather quickly.
Try to solve the same sort of problems Arbital or Metaculus is optimizing for. No reason to step on the toes of other projects in the community.
I don't think that it's bad to have multiple websites that gather predictions. It's good to have different websites trying different approaches.
Metaculus has curated questions; users can suggest new questions, but they have to be chosen. Predictionbook allows user-suggested questions. GJOpen is almost completely curated, with few ways for users to suggest questions (except a bit; see the recent question asking for new questions). Metaculus starts by showing the user the average guesses of the community before the user votes.
Metaculus lets the user pick probabilities with a sliding scale. Predictionbook lets the user input a number. GJOpen lets the user click twice (first on 10-19, then on 14).
All three try to score users differently.
I would welcome more experimentation.
Hey, everyone! Author of rationalfiction.io here.
I am actively building and improving our website, and I would be happy to offer it as a new platform for LW community, if there's interest.
I can take care of the hosting, and build all the necessary features.
I've been thinking about creating a LW-like website for a while now, but I wasn't sure that it would work. After reading this post I have decided that I'm going to launch and see where it goes.
If there are any ideas or suggestions about how such a platform could be improved, or what features we'll need, let's discuss them.
By the way, the platform is open source (though I will probably fork it as a separate project and develop it in a new repo).
It's hard to know for sure, since previous surveys were only advertised on the LessWrong.com domain, but it doesn't seem like the diaspora thing has slowed the growth of the community a ton and it may have dramatically accelerated it.
My impression is that a big part of the increase in survey responses was that this was explicitly advertised as a diaspora survey, with a number of people saying things like "if you're reading this, it's for you."
Nice ideas! I think you highlighted well the fundamental problem: the lack of social rewards for writing content for LW, and the strong criticism writers face when they do.
Regarding changing things, I think it makes sense to work with people like Scott who have a lot of credibility, and figure out what would work for them.
However, it also seems that LW itself has a certain brand, and attracts a sizable community. I would like to see a version of the voting system you described implemented here, with people who have more karma having votes that weigh more. I'd also like to see some cross-posting of content from Scott and others on LW itself.
So not doing away with LW as it exists, but expanding it in collaboration with others who would be interested in revitalizing a different form of LW. One where authors get appropriate credit for posting, with credible people - those who have lots of karma - being able to upvote them more.
I am honored that my survey writeup produced this level of quality discussion and endorse this post.
(Though not necessarily its proposed upvote scheme, sounds kind of flaky to me. I'm personally very skeptical of upvotes and community curation as cure-alls.)
I am honored that my survey writeup produced this level of quality discussion and endorse this post.
Thanks!
(Though not necessarily its proposed upvote scheme, sounds kind of flaky to me. I'm personally very skeptical of upvotes and community curation as cure-alls.)
What's your favored solution?
The incentives are currently a major problem. Looking at my posting history, my most upvoted posts are all light and fluffy things.
Try to tackle a difficult and controversial problem and you'll find it very hard to get upvotes, because these will mostly be balanced out by downvotes from people who strongly believe in the opposite.
How much are you motivated by votes in the first place, though? I care a bit that a comment got any reaction, and a bit that it's positive, but I give a lot more weight to followups and responses than to votes. And I especially don't care about vote magnitude. 2 or 3 is as good as 10 or 12 to my happiness-at-posting response.
I think I probably care less about votes than the average person, since I appreciate feedback. I suspect that many people who received as many critical comments and downvotes as I have would give up on posting on the forum (not that most of my posts end up negative, but I've often had posts downvoted into the negative, then voted back up by other people).
MinibearRex did this a while back, and the Rationality Reading Group recently finished RAZ. I weakly suspect that we're better off doing this with something besides the Sequences, like Superintelligence or Good and Real or so on.
(But if you want to do Sequence reposts, go ahead.)
I don't know if it was in the comments here or on an SSC post, but when talking about the rationalist diaspora and where the community goes from here, Scott has said he would welcome blog posts from guest authors, and mentioned several people he'd be willing to have on the site, or had already invited to make a guest post. Naturally, the guest authors he mentioned were already once-prominent LW bloggers--I forget who he mentioned besides Eliezer, who declined, but there were almost a dozen others. Scott's writing is so impressive that I wouldn't be surprised if even some of his close friends, writers whom hundreds of us consider as good as or sometimes better than Scott, are personally too intimidated to post on SSC.
Well, that's one hypothesis: posts on SSC are sometimes so top-notch that guest authors might feel they're not up to snuff. Another hypothesis is that, for prominent authors, LW was a forum which exhausted all the low-hanging fruit, and wasn't receptive to juicier, edgier topics, like the culture and politics Scott writes about. However, with that level of exposure on a personal blog, writing on more controversial topics earns a lot more scrutiny. SSC isn't without its share of contentious posts. Maybe all that politicking--having the patience to grit your teeth and exercise the principle of charity in the face of hundreds of commenters--is a skill Scott has that's harder for the rest of us to hack. Maybe other diaspora authors know this, and don't think they can thrust themselves into the spotlight without getting burned out.
I think if the LW/rationalist community made an effort to publicly laud the authors of various blogs we like, they'd feel like it's more worth the effort if respected readers want more. I don't know if that'll work. However, I know that if there were a thread where dozens of people were commenting that they liked my blog, even though they usually didn't speak up about it, and each of those comments had dozens of upvotes or whatnot, I'd be more inclined to write.
I'm aware Ozy Frantz had one or two guest posts on SSC as well, and that they and Scott sometimes had a dynamic of responding to posts on each other's blogs, which was interesting. AFAIK, that was mostly while they were dating. I don't know whether Ozy might ever be a guest author on SSC again.
I believe that you'll need to attend a CFAR workshop ($3,900 without a scholarship) to receive a subscription to the CFAR mailing list. I'd be willing to pay some amount just to get added to it, since I already have a CFAR workbook, and am relatively familiar with the material taught during the workshops.
It's not public; it's an alumni mailing list. By only opening the mailing list to people who attended CFAR, CFAR manages to create a selected circle of people with shared vocabulary.
Two questions:
Can anyone who has been a user for a significant amount of time give links to anything that wasn't deemed worthy of the sequences but is a worthy read? I have no idea when the sequences were collected, but if LW was really great in the past, there would've been a bunch of other high-quality posts that are easily missed. This could also double as proof that LW was, indeed, as great as advertised.
What do other places have that LW doesn't? If LW is dedicated to human rationality, is it truly doing that?
Am I a complete dumbass for typing this? In hindsight, it doesn't take a special variation of Godwin's law to think 'someone probably posted a similar question before'.
1. There have been several posters who wrote some very nice articles, including Alicorn, lukeprog, Yvain, AnnaSalamon, and Wei_Dai. (Listed in order on a sort of life-hacks to decision-theory spectrum.)
Oh, and here's a classic by that prolific author, anonymous (Who? That would be telling :) )
2. To be uncharitable, we might say that other places have way more discussions of race, politics, and gender. Or to be uncontroversial, we might just say that other places have a lot more ordinary blog-type content, which people read for ordinary blog-type reasons.
A lot of the diaspora blogs I like most (e.g. Otium, Paul Christiano's Medium) don't have such content, and are correspondingly unpopular.
3. On question 1, there are definitely index posts aimed at this sort of thing, but I couldn't find the specific one I was thinking of with just a cursory search.
This is a response to ingres' recent post sharing Less Wrong survey results. If you haven't read & upvoted it, I strongly encourage you to--they've done a fabulous job of collecting and presenting data about the state of the community.
So, there's a bit of a contradiction in the survey results. On the one hand, people say the community needs to do more scholarship, be more rigorous, be more practical, be more humble. On the other hand, not much is getting posted, and it seems like raising the bar will only exacerbate that problem.
I did a query against the survey database to find the complaints of top Less Wrong contributors and figure out how best to serve their needs. (Note: it's a bit hard to read the comments because some of them should start with "the community needs more" or "the community needs less", but adding that info would have meant constructing a much more complicated query.) One user wrote:
ingres emphasizes that in order to revitalize the community, we would need more content. Content is important, but incentives for producing content might be even more important. Social status may be the incentive humans respond most strongly to. Right now, from a social status perspective, the expected value of creating a new Less Wrong post doesn't feel very high. Partially because many LW posts are getting downvotes and critical comments, so my System 1 says my posts might as well. And partially because the Less Wrong brand is weak enough that I don't expect associating myself with it will boost my social status.
When Less Wrong was founded, the primary failure mode guarded against was Eternal September. If Eternal September represents a sort of digital populism, Less Wrong was attempting a sort of digital elitism. My perception is that elitism isn't working because the benefits of joining the elite are too small and the costs are too large. Teddy Roosevelt talked about the man in the arena--I think Less Wrong experienced the reverse of the evaporative cooling EY feared, where people gradually left the arena as the proportional number of critics in the stands grew ever larger.
Given where Less Wrong is at, however, I suspect the goal of revitalizing Less Wrong represents a lost purpose.
ingres' survey received a total of 3083 responses. Not only is that about twice the number we got in the last survey in 2014, it's about twice the number we got in 2013, 2012, and 2011 (though much bigger than the first survey in 2009). It's hard to know for sure, since previous surveys were only advertised on the LessWrong.com domain, but it doesn't seem like the diaspora thing has slowed the growth of the community a ton and it may have dramatically accelerated it.
Why has the community continued growing? Here's one possibility. Maybe Less Wrong has been replaced by superior alternatives.
Less Wrong had a great run, and the superior alternatives wouldn't exist in their current form without it. (LW was easily the most common way people heard about EA in 2014, for instance, although sampling effects may have distorted that estimate.) But that doesn't mean it's the best option going forward.
Therefore, here are some things I don't think we should do:
But that doesn't mean there's nothing to be done. Here are some possible weaknesses I see with our current setup:
ingres mentions the possibility of Scott Alexander somehow opening up SlateStarCodex to other contributors. This seems like a clearly superior alternative to revitalizing Less Wrong, if Scott is down for it:
But the most important reasons may be behavioral reasons. SSC has more traffic--people are in the habit of visiting there, not here. And the posting habits people have acquired there seem more conducive to community. Changing habits is hard.
As ingres writes, revitalizing Less Wrong is probably about as difficult as creating a new site from scratch, and I think creating a new site from scratch for Scott is a superior alternative for the reasons I gave.
So if there's anyone who's interested in improving Less Wrong, here's my humble recommendation: Go tell Scott Alexander you'll build an online forum to his specification, with SSC community feedback, to provide a better solution for his overflowing open threads. Once you've solved that problem, keep making improvements and subfora so your forum becomes the best available alternative for more and more use cases.
And here's my humble suggestion for what an SSC forum could look like:
As I mentioned above, Eternal September is analogous to a sort of digital populism. The major social media sites often have a "mob rule" culture to them, and people are increasingly seeing the disadvantages of this model. Less Wrong tried to achieve digital elitism and it didn't work well in the long run, but that doesn't mean it's impossible. Edge.org has found a model for digital elitism that works. There may be other workable models out there. A workable model could even turn into a successful company. Fight the hot new thing by becoming the hot new thing.
My proposal is based on the idea of eigendemocracy. (Recommended that you read the link before continuing--eigendemocracy is cool.) In eigendemocracy, your trust score is a composite rating of what trusted people think of you. (It sounds like infinite recursion, but it can be resolved using linear algebra.)
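For the curious, here's roughly what the linear-algebra step looks like: a PageRank-style power iteration over an endorsement matrix. This is a generic sketch of the idea, not the specific formulation in the linked post; the damping factor and the example endorsements are illustrative:

```python
import numpy as np

def trust_scores(endorsements, damping=0.85, iterations=100):
    """Compute eigendemocracy-style trust scores by power iteration.

    endorsements[i][j] = 1 means user i vouches for user j. A user's
    score is the endorsement-weighted sum of the scores of everyone
    who vouches for them; the apparent infinite recursion resolves
    to the dominant eigenvector of the endorsement matrix.
    """
    m = np.asarray(endorsements, dtype=float)
    n = m.shape[0]
    # Normalize each user's outgoing endorsements; users who endorse
    # no one spread their weight evenly over everybody.
    row_sums = m.sum(axis=1, keepdims=True)
    m = np.where(row_sums > 0, m / np.where(row_sums == 0, 1, row_sums), 1.0 / n)
    scores = np.full(n, 1.0 / n)
    for _ in range(iterations):
        scores = (1 - damping) / n + damping * (scores @ m)
    return scores / scores.sum()

# Three users: 0 and 1 endorse each other, and both endorse 2.
scores = trust_scores([[0, 1, 1],
                       [1, 0, 1],
                       [0, 0, 0]])
```

User 2, vouched for by both others, ends up with the highest trust score, even though they endorse no one themselves.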
Eigendemocracy is a complicated idea, but a simple way to get most of the way there would be to have a forum where having lots of karma gives you the ability to upvote multiple times. How would this work? Let's say Scott starts with 5 karma and everyone else starts with 0 karma. Each point of karma gives you the ability to upvote once a day. Let's say it takes 5 upvotes for a post to get featured on the sidebar of Scott's blog. If Scott wants to feature a post on the sidebar of his blog, he upvotes it 5 times, netting the person who wrote it 1 karma. As Scott features more and more posts, he gains a moderation team full of people who wrote posts that were good enough to feature. As they feature posts in turn, they generate more co-moderators.
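The scheme above can be sketched as a toy model. The class and method names are hypothetical, and the daily vote budget and feature threshold are just the numbers from the example:

```python
from collections import defaultdict

FEATURE_THRESHOLD = 5  # upvotes needed to feature a post

class Forum:
    """Toy model of the karma-gated upvote scheme described above.

    Each point of karma grants one upvote per day; a post that collects
    FEATURE_THRESHOLD upvotes is featured, netting its author one point
    of karma (and so a seat on the de facto moderation team).
    """
    def __init__(self):
        self.karma = defaultdict(int)
        self.post_votes = defaultdict(int)
        self.votes_spent = defaultdict(int)  # resets each day
        self.featured = set()
        self.authors = {}

    def new_day(self):
        self.votes_spent.clear()

    def submit(self, post, author):
        self.authors[post] = author

    def upvote(self, voter, post, times=1):
        # A voter can spend at most (karma - votes already spent today).
        budget = self.karma[voter] - self.votes_spent[voter]
        times = min(times, budget)
        self.votes_spent[voter] += times
        self.post_votes[post] += times
        if post not in self.featured and self.post_votes[post] >= FEATURE_THRESHOLD:
            self.featured.add(post)
            self.karma[self.authors[post]] += 1

forum = Forum()
forum.karma["scott"] = 5              # Scott seeds the system with 5 karma
forum.submit("great-post", "alice")
forum.upvote("scott", "great-post", times=5)
# "great-post" is now featured, and alice has 1 karma to spend tomorrow.
```

The self-reinforcing part is visible even in the toy: once alice has karma, her upvotes start counting toward featuring the next author, and so on down the chain.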
Why do I like this solution?
TL;DR - Despite appearances, the Less Wrong community is actually doing great. Any successor to Less Wrong should try to offer compelling advantages over options that are already available.