I don't want LW to change in that direction.
In the famous talk "You and Your Research", Richard Hamming explained why physicists don't spend much time on researching antigravity:
The three outstanding problems in physics, in a certain sense, were never worked on while I was at Bell Labs. By important I mean guaranteed a Nobel Prize and any sum of money you want to mention. We didn't work on (1) time travel, (2) teleportation, and (3) antigravity. They are not important problems because we do not have an attack. It's not the consequence that makes a problem important, it is that you have a reasonable attack.
We can talk productively here about topics like decision theory because we have an attack, a small foothold of sanity (established mostly by Eliezer and Wei) that gives us a firm footing to expand our understanding. As far as I can see, we have no such footholds in politics, or gender relations, or most of those other important topics you listed. I've been here for a long time and know that most of our interminable "discussions" of these controversial topics have been completely useless. Our rationality helps us maintain a civil tone, but not actually, you...
We can talk productively here about topics like decision theory because we have an attack, a small foothold of sanity (established mostly by Eliezer and Wei) that gives us a firm footing to expand our understanding. As far as I can see, we have no such footholds in politics, or gender relations, or most of those other important topics you listed. I've been here for a long time and know that all our interminable "discussions" of these controversial topics have been completely useless.
Therefore posts on such subjects should be made if and when such an attack is found? I would support that standard.
cousin_it:
We can talk productively here about topics like decision theory because we have an attack, a small foothold of sanity (established mostly by Eliezer and Wei) that gives us a firm footing to expand our understanding. As far as I can see, we have no such footholds in politics, or gender relations, or most of those other important topics you listed.
To me, this sounds way too ambitious for a place that advertises itself as a public forum, where random visitors are invited with kind words to join and participate, and get upvoted as long as they don't write anything outright stupid or bad-mannered.
You're correct about the reasons why physicists don't work on antigravity, but you'll also notice that they don't work by opening web forums to invite ideas and contributions from the general public. A community focusing strictly on hard scientific and mathematical progress must set the bar for being a contributor way higher, so high that well over 90% of the present rate of activity on this website would have to be culled, in terms of both the number of contributors and the amount of content being generated. At that point, you might as well just open an invitation-only mail...
In contrast, the present state of knowledge in softer fields is so abysmally bad, and contaminated with so much bias and outright intellectual incompetence, that a group of smart and unbiased amateurs can easily reach insight beyond what's readily available from reputable mainstream sources about a great variety of issues.
I'm afraid that if we accept this suggestion, most posts about softer fields will consist of seemingly plausible but wrong contrarian ideas, and since most of us won't be experts in the relevant fields, it will take a lot of time and effort for us to come up with the necessary evidence to show that the ideas are wrong.
And if we do manage to identify some correct contrarian insight, it will have minimal impact on society at large, because nobody outside of LW will believe that a group of smart and unbiased amateurs can easily reach such insight.
The comment definitely wasn't well-worded if it seems like there's a contradiction there; in fact, my failure to convey the point suggests that the wording was quite awful. (Thus providing more evidence that people are way too generous with upvoting.) So please let me try once more.
I was trying to draw a contrast between the following:
Topics in math and hard science, in which any insight that can't be found by looking up the existing literature is extremely hard to come by. It seems to me that a public web forum that invites random visitors to participate freely is, as a community, inherently unusable for achieving any such goal. What is required is a closely-knit group of dedicated researchers that imposes extremely high qualifications for joining and whose internal discussions will be largely incomprehensible to outsiders, the only exception being the work of lone geniuses.
Topics in softer fields, in which the present state of knowledge is not in the form of well-organized literature that is almost fully sound and extremely hard to improve on, but instead even the very basics are heavily muddled and biased. Here, in contrast, there is plenty of opportunity to achieve some
My concern about repealing the ban on discussing politics is that it risks deteriorating the quality of the discussion here - not only by increasing the risk of flame wars, but more subtly by making other people aware of who they agree with or disagree with on politics, which might make some other discussions more polarized too.
Right now, while browsing a discussion here, I generally have no idea about the politics of those involved, and I kinda like that. If there was a lot of discussion on more divisive subjects, my judgement might be more clouded - "oh, he's an asshole anyway, I won't pay attention to him".
Maybe I'm being too pessimistic about the capacity of the average LessWronger to be honest with himself and others about his bias.
I would be in favour of opening a separate forum, where LessWrongers log in with different usernames but try to stick to LessWrong conventions. That would avoid the "reputation bleedback" to the main LessWrong, and it makes it easier to cut the limb off if it shows signs of gangrene.
Right now, while browsing a discussion here, I generally have no idea about the politics of those involved, and I kinda like that. If there was a lot of discussion on more divisive subjects, my judgement might be more clouded - "oh, he's an asshole anyway, I won't pay attention to him".
Upvoted for this.
Less Wrong has the highest accuracy and signal to noise ratio of any blog I've seen, other than those that limit themselves to narrow specialties.
This is an argument for being very protective of the s/n ratio here, and to "confront wrongness wherever it appears" is the opposite of being protective.
I recommend delaying the ambitious, worthwhile goal you advocate until we have better mechanisms for protecting the quality of a conversation.
Several years ago, I used to go to predominantly Christian conservative forums and get in arguments. It was mostly for entertainment, but it was also kind of an eye-opening experience. I did the same sort of thing, in different forums, for nuclear power. Most big controversial issues are so steeped in the Dark Arts that you'll just get overshouted if you try to calmly lay out a reasonable and well-organized argument. As the saying goes, they'll drag you down to their level and beat you with experience.
It's possible to be frightfully effective in such an argument, but the methods you have to use feel pretty dirty, because they are. If you can force people to defend indefensible implications of their position, you score points. If you can trick a bunch of opponents into saying something obviously stupid, you score points. If you can find a loud idiot and make a fool of them in public, so that the lurkers don't want to be associated with him, you score big. And often pure typing speed helps; if you're answering the same arguments again and again, you can fire off a lot of replies very fast, flying on autopilot. A handful of vocal people using such tactics can change the default views ...
There are three things, as I see it, that LW could be, and posts seem to revolve around one of these three categories.
1. An all-purpose discussion forum, about all sorts of things, designed for people with a roughly technical bent who like a high quality of discussion. Topics that seem to draw a lot of attention are non-mainstream controversies, philosophy, politics, and self-help.
2. A blog about bias, rationality, and how to exercise more of the latter and less of the former. Topics can be technical (drawing on cognitive sciences and probability theory) or non-technical (illustrations of common fallacies and how to evade them).
3. A blog about decision theory, FAI, and existential risk.
Personally, I would enjoy 1 or 2, and I'd be less interested in 3. I wouldn't mind LW going in a looser, more casual direction; alternatively, I wouldn't mind it going in a more focused, technical direction. I am interested in the intersection between, roughly, statistics and cognition, and I think that we could get a lot of interesting speculation done about how people think. I'd probably quit if LW became all decision theory, all the time, because that's not really one of my interests, but I c...
I would want to go even further, and strike out (perceived) "importance" as a barrier. Thinking in terms of "importance" will tend to cause our minds to stay within certain topic clusters, when what we actually want is more variety of topics. Rationality lessons are often most illuminating when applied in situations we don't stereotypically think of as illustrating rationality lessons. People may have pet topics or specialized areas of expertise that they would like to post on, but don't because of a fear that their subject isn't "important enough" (which in practice tends to mean being about the topics most commonly discussed here). This is unfortunate, because rationality literally applies everywhere; and I think an aspiring rationalist should seek out as many diverse opportunities for honing their general rationality skills as possible. This will prove useful when it comes to the "important" topics.
On the other hand,
These have already been discussed so they would be discouraged under the duplicates rule (except for substantially new approaches),
I actually wouldn't want to restrict duplicates to new approaches to the subject itself; I think ...
This is unfortunate, because rationality literally applies everywhere; and I think an aspiring rationalist should seek out as many diverse opportunities for honing their general rationality skills as possible.
I would like to know how widely agreed-on this attitude is on LW; I have been specifically resisting writing posts about random things which are of interest to me but don't correlate to any of the site's major common themes (rationality in the abstract, AI, health, philosophy). I'd be happy to write posts applying rational principles to everyday circumstances, but I'd want a stronger signal that the community would appreciate it first.
While I am generally impressed with the level of rationality in discourse here, I really doubt that we have the quantity of information necessary to really come to a well-informed consensus on any controversial subject.
I'm not sure exactly what kinds of issues the OP had in mind, but I worry that we wouldn't do particularly well here on topics like AGW, or what went wrong with the housing market, or how much of IQ is genetic, or nuclear power, or any number of other fascinating and important topics where a dose of rationality might do some good.
We wouldn't do well on these topics, in spite of our rationality, because doing well on these topics requires information, and we don't have any inside track to reliable information.
Perplexed:
I worry that we wouldn't do particularly well here on topics like AGW, or what went wrong with the housing market, or how much of IQ is genetic, or nuclear power... We wouldn't do well on these topics, in spite of our rationality, because doing well on these topics requires information, and we don't have any inside track to reliable information.
There are indeed such topics, but in my opinion, none of the specific ones you mention are among them. In all of these, the public information available at the click of a mouse (or, in the worst case, with a visit to a university library) is almost as good as anyone in the world has, and the truly difficult part is how to sort out correct insight from bullshit using general rules of good epistemology.
I believe that if Less Wrong users changed the posts they made and the posts/comments they upvoted to try and "confront wrongness wherever it appears," as you suggest, that the value of Less Wrong to human society would be significantly increased.
Contrary to the principles behind Less Wrong (or what I perceive to be the principles behind Less Wrong), I doubt that most Less Wrong users can get anywhere near as much value by thinking about being rational for a minute as they could by applying their rationality for a minute. Trying to develop consensus among Less Wrong users on difficult issues seems like it is almost certainly the most effective way we can apply our rationality in the context of making posts on Less Wrong, because consensus among Less Wrong users is a valuable thing to reference when making decisions which are actually important about issues which are not, strictly speaking, relevant topics for Less Wrong.
Whether the correct response is to stop reading/posting on Less Wrong or to confront wrongness on Less Wrong is unclear (you can tell from my post count what I have done in the past).
[In general I enjoy Less Wrong and believe its existence is good, but I also tend to agree with some recent posts that for most people it is a distraction rather than a resource and should be treated as such.]
Less Wrong has the highest accuracy and signal to noise ratio of any blog I've seen, other than those that limit themselves to narrow specialties.
I agree, there are very few disruptors and trolls here.
I'm inclined to agree with your proposal, but I wonder if there are supplementary community norms that, if made explicit, might make it easier to venture into confusing and polarizing topics without losing LW's usual level of accuracy and of signal to noise. (I assume fear of filling the blog with nonsense, and thereby losing some good readers/commenters, is much of what currently keeps e.g. political discussion off of LW.)
Maybe it would help to have heuristics such as "if you don't have anything clear and obviously correct to say, don't say anything at all", that could be reiterated and enforced when tricky topics come up.
"if you don't have anything clear and obviously correct to say, don't say anything at all"
Far from everything on LW is obviously correct.
"Less Wrong Should Confront Wrongness Wherever it Appears"
As an economist I can tell you that most public discussions of economics contain a huge amount of wrongness. Do we want hundreds of econ posts?
Economic policy is a prime example (and maybe a rare example) where learning a small amount of factual information -- basically, the content of Econ 101 -- is enough to change people's views, to the point that there are areas of rough normative consensus among everybody who knows Econ 101.
Economists are already popularizing Econ 101, very well in my opinion, and we don't have to do it here.
The other thing I want to point out is that we should not expect everything in the world to be like economics. It is not always the case that a little evidence and a little rational thought will create wide agreement. It is not always the case that opinions divide neatly into the "smart" and the "stupid." I really don't want LW to develop the attitude that "with a little rationality we can do better than the rest of the world!" on normative issues.
I, for one, would love to have a place where a rational and open-minded discussion of economics would be possible with people who have some knowledge of the subject. In my experience, and with very few honorable exceptions, economists are extremely difficult to reason with as soon as one starts questioning the logical and empirical soundness of some basic concepts in modern economics, or pointing out seemingly bizarre and illogical things found in the mainstream economic literature. You quickly run into an authoritative and stonewalling attitude of the sort that you never get by posing similar questions to, say, physicists.
I would venture to say that a radical re-examination of several basic economic concepts is probably the lowest-hanging fruit when it comes to valuable insight that could potentially be gained by a group of smart amateur contrarians. The whole field is certainly long overripe for the sort of treatment that Robin Hanson metes out to medicine.
Relsqui:
I'm sure this wasn't your intent, but this comes across to me like a situation where you've been having a high-level conversation about economics and then switch to asking the economist to explain or justify the basic premises of the field to you.
The thing is, if you ask a physicist to answer a critical question you have about some fundamental thing in physics, he'll likely be able to point you to the literature where your specific conundrum is resolved clearly and in great detail, or provide such an answer himself. I don't know what would happen if you came up with an entirely novel question (I sure never did), but from what I've observed, I would expect that it would be met with genuine curiosity. Moreover, good introductory literature in physics often anticipates and preemptively answers many objections to the basic concepts that a smart critical student of the subject might come up with. Of course, if you're being block-headed and impervious to arguments, that's a different story, but that's not what I'm talking about.
In contrast, in economics one rarely sees anything like this. The concepts are presented with an air of high authority, and various more or less strai...
Proposal: Have this be another site that everyone can read, with hidden user names and no effect on karma, but that only people with a certain amount of karma on LW can post on.
This way we'd get the filtering process of good LW users, while being unable to affect LW back in any way I can see. My intuition says it has a few more advantages as well, but I've forgotten what those were.
Please expand and improve this proposal with more suggestion, subsuggestions and metasuggestions!
Comment that helped inspire this idea: http://lesswrong.com/lw/2qi/less_wrong_should_confront_wrongness_wherever_it/2nmw?c=1
As a result, Less Wrong is well positioned to find and correct errors in the public discourse.
Risky. We could perhaps survive some discussion on public policy without any damage. But after a threshold was crossed, we would just start fracturing into blue or green teams.
However, what we need to do is analyze how many recruits we would lose by taking a stand on a certain issue (and, even more damaging, how many nonrationalists on "our" team we would attract) and compare the utility lost from future less rational behavior to what is g...
There's too much error out there to confront it all.
Stick to cases where there's something generally interesting to say.
Less Wrong Should Confront Wrongness Wherever it Appears
I approve of this norm and voted the post up. That is a high form of praise since I evaluate all 'should' claims that apply to anything I care about to a standard that is an order of magnitude more rigorous.
Very well said.
This is generating a lot of discussion about what Less Wrong should be, and proposed norms. I like this one and this one, for example.
Well, the norms of Less Wrong are currently scattered between a large number of previous posts, comments, and wiki pages. It is very difficult for a newcomer to find out what they are, and easy for long-time participants to forget about some of them.
Therefore, I propose gathering all of Less Wrong's policies and norms into a top-level post. I also think it's very important that the contents of that post be representative ...
if a topic seems to have two valid opposing sides, it probably means you don't understand it well enough to tell which is correct
Or that perhaps those sides are extremists while some optimal balance between the positions is required. E.g. "vitamin supplements are useless and evil" and "you will probably die unless you have some" are both unlikely to be "correct".
(OTOH, perhaps you meant "which is correct" in a sense that includes other possibilities besides the two opposing ones, in which case, never mind, carry on. ;-) )
I think that we definitely should find some problems to tackle that are not minefields of bias and confusion, but are still challenging.
I think these should be within a field that is mature enough that there will be hard open problems, but also problems which look hard to a novice yet have definite answers to an expert.
To keep us interested, it should also be something more practical, so that fixing the big things or learning how to do the everyday things will help us or others.
Fine, I'll be the first to give directions to the minefield.
So, for the other Americans here: Are you going to vote in the upcoming election, and which party's candidate(s) are you voting for? ;)
I intend to vote for the Democrat Rush Holt for Congress. (Neither Senator from my state is up for reelection this year.) Anyone want to try talk me out of it?
Is that the kind of post you wanted?
Less Wrong has the highest accuracy and signal to noise ratio of any blog I've seen, other than those that limit themselves to narrow specialties. In fact, I doubt anyone here knows a better one. The difference is very large.
... except when it comes to group-serving bias?
It is better to link to the original archive page than to the image, and better to link to the creator's site than to an unaffiliated site.
Less Wrong should confront wrongness wherever it appears.
The wrongularity - a point of infinite wrongness, from which right cannot emerge - must be near.
Different topics are differently important by many orders of magnitude, and differently easy to say something new and persuasive about. That suggests we should take great care to pick our battles, unless the point is to draw in new people by feeding off popular controversy -- but that doesn't seem to be your aim here.
In a recent discussion about a controversial topic which I will not name here, Vladimir_M noticed something extremely important.
I have separated it from its original context, because this issue applies to many important topics. There are many topics where the information that most people receive is confused, wrong, or biased, and where nonsense drowns out truth and clarity. Wherever this occurs, it is very bad and very important to notice.
There are many reasons why it happens, many of which have been explicitly studied and discussed as topics here. The norms and design of the site are engineered to promote clarity and correctness. Strategies for reasoning correctly are frequently recurring topics, and newcomers are encouraged to read a large back-catalog of articles about how to avoid common errors in thinking (the sequences). A high standard of discourse is enforced through voting, which also provides rapid feedback to help everyone improve their writing. Since Well-Kept Gardens Die by Pacifism, when the occasional nutjob stops by, they're downvoted into invisibility and driven away - and while you wouldn't notice from the comment archives, this has happened lots of times.
Less Wrong has the highest accuracy and signal to noise ratio of any blog I've seen, other than those that limit themselves to narrow specialties. In fact, I doubt anyone here knows a better one. The difference is very large. While we are certainly not perfect, errors on Less Wrong are rarer and much more likely to be spotted and corrected than on any similar site, so a community consensus here is a very strong signal of clarity and correctness.
As a result, Less Wrong is well positioned to find and correct errors in the public discourse. Less Wrong should confront wrongness wherever it appears. Wherever large amounts of utility depend on clear and accurate information, that information is not already prevalent, and we have the ability to produce or properly filter it, we ought to do so, even if doing so is incompatible with status signaling, or off topic, or otherwise at odds with non-vital social norms.
So I propose the following as a community norm. If a topic is important, the public discourse on it is wrong for any reason, it hasn't appeared on Less Wrong before, and a discussion on Less Wrong would probably bring clarity, then it is automatically considered on-topic. By important, I mean topics where inaccurate or confused beliefs would cost lots of utility for readers or for humanity. Approaching a topic from a new and substantially different angle doesn't count as a duplicate.
EDIT: This thread is producing a lot of discussion about what Less Wrong's norms should be. I have proposed a procedure for gathering and filtering these discussions into a top-level post, which would have the effect of encouraging people to enforce them through voting and comments.
Less Wrong does not currently provide strong guidance about what is considered on topic. In fact, Less Wrong generally considers topic to be secondary to importance and clarity, and this is as it should be. However, this should be formally acknowledged, so that people are not discouraged from posting important things just because they think they might be off topic! Determining whether something is on topic is a trivial inconvenience of the worst sort.
When writing posts on these topics, it is a good idea to call out any known reasons why the public discourse may have gone awry, to avoid hitting the same traps. If there's a related but different position that's highly objectionable, call it out and disclaim against it. If there's a standard position which people don't want to or can't safely signal disagreement with, then clearly label which parts are true and which aren't. Do not present distorted views of controversial topics, but more importantly, do not present falsehood as truth in the name of balance; if a topic seems to have two valid opposing sides, it probably means you don't understand it well enough to tell which is correct. If there are norms suppressing discussion, call them out, check for valid justifications, and if they're unjustified or the issues can be worked around, ask readers not to enforce them.
I would like to add a list of past Less Wrong topics which had little to do with bias, except that the public discourse on them was impaired by it. These have already been discussed, so they would be discouraged under the duplicates rule (except for substantially new approaches), but they are good examples of the sorts of topics we should all be looking for: the accuracy of criminal justice (which we looked at in the particular case of Amanda Knox); religion, epistemology, and death; health and nutrition; akrasia; specific psychoactive drugs and psychoactive drugs in general; gender relations, racial relations, and social relations in general; social norms in general and the desirability of particular norms; charity in general and the effectiveness of particular charities; philosophy in general and the soundness of particular philosophies.
By inadequate public discourse, I mean that either (a) they're complex enough that most information sources are merely useless and confusing, (b) social norms make them hard to talk about, or (c) they have excessive noise published about them due to bad incentives. Our job is to find more topics, not in this list, where correctness is important and where the public dialogue is substantially inadequate. Then write something that's less wrong.