A year and a half later, I see very few LW posts that follow the advice offered in this post. Do people think it's a problem that we rarely respond to posts at other rationalist blogs? Should we try to do it more often?
Ha, I stumbled on this comment (and subsequently this post) while searching through your submissions for your post on metaphilosophical mysteries, which I wanted to link to in a comment on Robin's post about the meaning of the meaning of life.
But anyway, is there a list anywhere of other rationalist blogs/forums? Maybe if such a list were created then people would add such blogs into their RSS feeds, or participate in the forums (though the latter is less likely). Michael Anissimov's blog, Overcoming Bias, GNXP, and Roissy's blog are the other rationalist blogs in my feed at the moment. Others, like your own, Steven0461's, and Philosophy Etc., don't seem to update very often. The Everything List seems pretty much dead or irrelevant. Two blogs that come to mind that I should subscribe to are Scott Aaronson's and Shane Legg's. Marginal Revolution isn't my thing but I'm sure others here would enjoy it. Same with Practical Ethics. Do you know of others?
I notice we tend not to interact with the New Atheists or places like RationalWiki, mostly because there's not much to learn from them. I came here from RW, and when I visit there from time to time there's really nothing interesting any more. The emphasis on reversed stupidity gets old. (But it's cool that they're way chiller and less pretentious.) Still, perhaps we could at least spend some effort evangelizing in such forums? If so, I heavily recommend against bringing up cryonics, the Singularity, et cetera, even though I suspect that would be the average Less Wronger's natural reaction; those topics are a case study in how naive traditional rationality and scientific majoritarianism fail to find the most relevant questions. The whole thing is difficult to navigate.
Various people on our blogs have talked about how useful a whuffie concept would be (see especially Vassar on reputation markets). I agree that Less Wrong's karma scores encourage an inward focus; however, the general concept seems so useful that we ought to consider finding a way to expand karma scores beyond just this site, as opposed to shelving the idea. Whether that is best implemented through Facebook or some other means is unclear to me. Can anyone link to any analysis on this?
So, on a related point that may or may not be worth its own post: looking at the new Less Wrong Facebook group, one rapidly becomes aware that basically everyone here is demographically identical. The vast majority are white men in their twenties, and among those who volunteered the information, most had degrees in math, science, or philosophy. There did appear to be a large international presence (and by international I mean European).
So my questions are: 1) why? What about the Less Wrong project selects for YWM? And 2) is it a problem? I tend to think that someone's biography influences their perspective to such an extent that it's useful to talk to and read people with different biographical backgrounds. So maybe it's just a matter of reading different blogs... on the other hand, if you're trying to build a broad rationalist movement, then we're doing something wrong, no?
With regard to school degrees, I would expect that the most popular fields of study for LW folks would be cognitive science, other sciences, mathematics, and philosophy, so that's no surprise. It has been my experience that people who are interested in rationality as a subject are generally interested in most or all of the above, and that people not interested in most or all of the above are generally not interested in rationality as a general topic.
I suspect that you'll find computer science (my own field of study) is very highly represented. Maybe even a majority. That's because within our field, any false beliefs and sloppy thinking eventually surface in the form of bugs; we spend hundreds of hours searching for and fixing these problems, which is applied rationality in pure form, with feedback. All programmers are forced to get in the habit of questioning their beliefs, and for us, improving cognitive skills has a very short payback period.
That's because within our field, any false beliefs and sloppy thinking eventually surface in the form of bugs...
Flattering theory, but it doesn't explain the abundance of programmers on Reddit where no one tries to overcome bias. I have a simpler theory: we programmers have a habit of surfing the internet looking for intellectual junk food.
You have to make a distinction between programmers in general and great programmers (or aspiring great programmers). The average programmer cares about bias no more (or little more) than the average person. But if you love programming and desire to become a jedi programmer, you absolutely have to get in the habit of being reflective about the underlying biases that were responsible for bugs, poor design decisions, etc., and figure out how the jedi are able to do what they do.
Read some interviews with (or writings of) great programmers -- people like Jon Bentley, Bill Joy, etc.: you'll see that they have great insight into cognitive biases that have to be overcome to become a great programmer, and they have found ways to overcome those biases.
Of course. I don't know how I forgot to include computer science, but you could argue that it is included in "mathematics and the sciences". Comp sci would probably be the largest group, both because of the large overlap between AI and rationality and because of the habit good programmers have of questioning their beliefs.
Agreed. I hold an undergraduate degree in computer science as well, and I'm quite certain that Eliezer has at minimum a graduate student's level of knowledge in the field.
Of course, computer science is arguably either a subset of applied mathematics, or an intersection of mathematics and engineering.
The question of youth seems easy, and I would expect LW readers to skew young. LW is heavily focused on a form of self-improvement that requires serious investment of effort and willingness to challenge preconceived notions. Young people are on average more open to new experience, are often already engaged in deep questioning of their worldview, have lots of time available, and are more likely to have the discipline required for intellectual reading and learning, since they're probably doing it at school or university already anyway. Most people do not like learning, and apart from learning related to their jobs, have no interest in continuing to learn after their formal education ends.
Not tackling the question of LW WM-ness for now, but this has been discussed before over at OB. I'd be curious if there's a significant difference between the proportion of WMs among LW readers and the proportion of WMs among the obvious feeder disciplines though...
I wonder whether this means that we're missing out on a lot of potential expertise though. (I'm thinking particularly of academics here, so discipline, interest etc. are assured.)
On the other hand, there's a chance that the youth skew is partially a function of the facebook side of the facebook/LW intersection...
I would like to see more people who practice rationality and assumption questioning in other disciplines: women's studies, public policy, art and literature. I took a lot of literary philosophy classes back in the day and read quite a few post-modern critiques that mirror what I see on Less Wrong.
Almost every post-modern analysis depends on questioning how someone framed their subject and proceeds to recommend different assumptions; surely people with these backgrounds have examples to offer outside of game theory and psychology.
It would also be good to see some legal types. Lawyers competing in front of judges who then make decisions that affect people's lives must certainly have put a little thought toward the roles of rationality and persuasion in truth seeking. Even if you don't care for lawyers, you have to wonder how judges proceed.
Maybe we should invade other forums and lead the discussions back here?
EDIT (In regard to that OB post on female perspectives, it's interesting that Robin Hanson of all people wasn't more humble about his potential lack of knowledge in a new field when his post got a poor response! Goes to show how important other perspectives are to this project.)
I'll read some OB and LW articles to my mom and hear why she's not interested in rationality.
Though, I can predict her response already: "You know how emotional I am! And you know I can't help it."
Robin's point about karma is worth exploring. Yes, votes help to filter comments and the modest score required to post makes sense. But what is the purpose of tracking very large totals? (Eliezer's doesn't even fit inside his little green circle!) This creates a competition and plays to emotional reinforcement mechanisms. It also can be intimidating to see for passers-by or would-be LW contributors.
It creates competition, yes, but that's not necessarily a bad thing. Personally I've found karma to be a great motivator in trying to write good comments and posts. For me, the top ten list is a considerable motivator - trying to achieve it has been part of the reason why I've been as active as I have been. I probably won't be able to reach it - there are other people who are consistently even more productive - but even then I try not to end up too far from the lowest-ranking people on it. I even think it might be good if people could bring up a list of all contributors as ranked by karma, not just the top ten.
Karma does create an incentive to write more and better comments. Still, the question is what alternative are you foregoing to write the marginal comment at LW? Should the top ten competition skew your investment of time toward LW comments and away from that alternative? Is it rational if it does?
Yes, I'm beginning to wonder how useful tracking karma is. Every time I've found myself referring to it, it has been for status purposes. Maybe a set of titles, e.g. Beginner (0-100), Intermediate (100-500), Master (500+), rather than a straight number, would be worthwhile.
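For concreteness, the proposed tier scheme could be as simple as a threshold lookup. This is just a sketch; the thresholds are the hypothetical ones suggested above, not anything Less Wrong actually implements:

```python
def karma_title(karma: int) -> str:
    """Map a raw karma total to a coarse title.

    Thresholds are the hypothetical Beginner/Intermediate/Master
    cutoffs from the comment above, not an actual LW feature.
    """
    if karma < 100:
        return "Beginner"
    elif karma < 500:
        return "Intermediate"
    else:
        return "Master"

# A displayed title leaks far less status information than a raw total.
print(karma_title(42))    # Beginner
print(karma_title(250))   # Intermediate
print(karma_title(2666))  # Master
```

One design question such a scheme would have to settle is whether tier boundaries are half-open (as here, where exactly 100 counts as Intermediate) and whether demotion is possible if karma is lost.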
I also wonder if hiding the total on comments (except an indicator for a negative score) would make votes more honest.
It would, but it would also make them less useful. What's the point of having a more accurate measure if it's at the cost of hiding it?
When I don't have a lot of time, I just skim through the comments and read those with a high score. I'd like to be able to keep doing that.
I worry a little that Less Wrong karma score incentives may encourage an inward focus, since karma is so far only scored for internal site activity.
We could incorporate a reddit-like system for rating relevant off-site posts and linking to them. This would also encourage people to respond to LW, as they'd get a link + traffic if they wrote a good post.
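As a minimal sketch of what such an off-site submission might look like as data, assuming a hypothetical `ExternalLink` record and vote counter (none of this exists on the site; the URL is a placeholder):

```python
from dataclasses import dataclass

@dataclass
class ExternalLink:
    """Hypothetical record for a rated off-site post, reddit-style:
    a URL plus a vote tally, surfaced on LW like any other comment."""
    url: str
    title: str
    votes: int = 0

    def upvote(self) -> None:
        self.votes += 1

# A well-received off-site response earns a visible link back, plus traffic.
link = ExternalLink("https://example.com/reply-to-lw", "A reply to Less Wrong")
link.upvote()
link.upvote()
print(link.votes)  # 2
```

The point is only that the same voting machinery used for internal comments could score pointers to external writing, which would extend karma incentives beyond the site itself.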
An insular and hidebound community falls into the class of problems a young site hopes it's lucky enough to be faced with. I'm a ~1 year OB reader, and while I never claim to represent these sites, I do correct unambiguous biases where the readership seems receptive, and direct people here who seem particularly ready (the last was a Randi conference-goer; another young white male working on a postgrad law degree--but I'm good at bringing girls to my martial arts class, so I'll try that here).
It is about the same. You can check that yourself by clicking on the number box (now reading 2,666,173) on the right hand column.
It has held steady, and may actually be improving slightly, as people who are interested in Robin's ideas are slowly beginning to come back.
As a regular reader of overcomingbias.com (through RSS) I honestly had never really noticed lesswrong.com yet, until Eliezer repeatedly posted about it recently. I do remember talk about reducing the number of posts on overcomingbias.
Overcomingbias.com is wonderful, because it is unique, perhaps, and because the contributors have lots of interesting stuff to say. Apart from that, and in spite of the unambiguous title, its identity doesn't seem overly clear. It doesn't always seem very focused on actually, practically overcoming bias.
That's perhaps where lesswrong has a role to play: as an exercise by an actual community in overcoming biases, in being less wrong, in "real life". For now, though, it isn't clear to me yet where this new site is headed. The people involved are obviously interested in the same issues that are discussed on overcomingbias, especially those of the Eliezer variety, or so is my first impression.
Karma scores are doubtless a powerful force, and I have no doubt that they will do exactly what Robin suggests they will. I don't think you can have one without the other, unless you find a way to award karma for external activity. However, I strongly agree with khafra that having some people a little too concerned with their karma scores is very much a happy problem at this stage. The bias toward doing what can be scored seems likely to work for us in at least the medium term.
There's a similar idea in web design (which, I think, is not merely a coincidence) - users spend most of their time on other people's websites. This entails that one must design one's site in a way that is intuitive relative to the designs of other people's sites.
Similarly, the use of closed-garden techniques, like an internal karma score and excessive jargon and internal references, will alienate folks who don't want to specifically 'be a member of this community' - which should not be all that this site is about.
I think OB has improved since LW started up. OB now feels calmer and it's better paced.
Most healthy intellectual blogs/forums participate in conversations among larger communities of blogs and forums. Rather than just "preaching to a choir" of readers, such blogs often quote and respond to posts on other blogs. Such responses sometimes support, and sometimes criticize, but either way can contribute to a healthy conversation.
If folks at Less Wrong saw themselves as a part of a larger community of rationalists, they would realize that most rationalist authors and readers are not at Less Wrong. To participate in a healthy conversation among the wider community of rationalists, they would often respond to posts at other sites, and expect other sites to respond often to them. In contrast, an insular group defined by something other than its rationality would be internally focused, rarely participating in such larger conversations.
Today at Overcoming Bias I respond to a post by Eliezer here at Less Wrong. Though I post occasionally here at Less Wrong, I will continue to post primarily at Overcoming Bias. I consider myself part of a larger rationalist community, and will continue to riff off relevant posts here and elsewhere. I hope you will continue to see me as a part of your relevant world.