In a recent discussion about a controversial topic which I will not name here, Vladimir_M noticed something extremely important: the necessary information is difficult to obtain in a clear and convincing form, and it's drowned in a vast sea of nonsense produced on the subject by just about every source of information in modern society.
I have separated this observation from its original context, because the problem applies to many important topics. On many topics, the information most people receive is confused, wrong, or biased, and nonsense drowns out truth and clarity. Wherever this occurs, it is very harmful and very important to notice.
There are many reasons why this happens, and many of them have been explicitly studied and discussed here. The norms and design of the site are engineered to promote clarity and correctness. Strategies for reasoning correctly are frequently recurring topics, and newcomers are encouraged to read a large back-catalog of articles on how to avoid common errors in thinking (the sequences). A high standard of discourse is enforced through voting, which also provides rapid feedback to help everyone improve their writing. And because well-kept gardens die by pacifism, when the occasional nutjob stops by, they're downvoted into invisibility and driven away; you wouldn't know it from the comment archives, but this has happened many times.
Less Wrong has the highest accuracy and signal-to-noise ratio of any blog I've seen, other than those that limit themselves to narrow specialties; in fact, I doubt anyone here knows a better one. The difference is very large. While we are certainly not perfect, errors on Less Wrong are rarer and much more likely to be spotted and corrected than on any similar site, so a community consensus here is a very strong signal of clarity and correctness.
As a result, Less Wrong is well positioned to find and correct errors in the public discourse, and it should confront wrongness wherever it appears. Wherever large amounts of utility depend on clear and accurate information, that information is not already prevalent, and we have the ability to produce or properly filter it, we ought to do so, even if doing so is incompatible with status signaling, off topic, or otherwise at odds with non-vital social norms.
So I propose the following as a community norm: if a topic is important, the public discourse on it is wrong for any reason, it hasn't already appeared on Less Wrong, and a discussion here would probably bring clarity, then it is automatically considered on-topic. By important, I mean that inaccurate or confused beliefs about the topic would cost lots of utility for readers or for humanity. Approaching a topic from a new and substantially different angle doesn't count as a duplicate.
EDIT: This thread is producing a lot of discussion about what Less Wrong's norms should be. I have proposed a procedure for gathering and filtering these discussions into a top-level post, which would encourage people to enforce the resulting norms through voting and comments.
Less Wrong does not currently provide strong guidance about what is considered on topic. In fact, Less Wrong generally treats topic as secondary to importance and clarity, and this is as it should be. However, this should be formally acknowledged, so that people are not discouraged from posting important things just because they think those things might be off topic! Wondering whether something is on topic is a trivial inconvenience of the worst sort.
When writing posts on these topics, it is a good idea to call out any known reasons why the public discourse may have gone awry, to avoid hitting the same traps. If there is a related but different position that is highly objectionable, call it out and explicitly disclaim it. If there is a standard position that people can't or don't want to safely signal disagreement with, then clearly label which parts of it are true and which are not. Do not present distorted views of controversial topics; more importantly, do not present falsehood as truth in the name of balance. If a topic seems to have two equally valid opposing sides, that probably means you don't understand it well enough to tell which is correct. If there are norms suppressing discussion, call them out, check whether they have valid justifications, and if they are unjustified or the issues can be worked around, ask readers not to enforce them.
I would like to add a list of past Less Wrong topics which had little to do with bias, except that the public discourse on them was impaired by it. These have already been discussed, so new posts on them would be discouraged as duplicates (except for substantially new approaches), but they are good examples of the sorts of topics we should all be looking for: the accuracy of criminal justice (which we looked at in the particular case of Amanda Knox); religion, epistemology, and death; health and nutrition; akrasia; specific psychoactive drugs and psychoactive drugs in general; gender relations, racial relations, and social relations in general; social norms in general and the desirability of particular norms; charity in general and the effectiveness of particular charities; and philosophy in general and the soundness of particular philosophies.
By inadequate public discourse, I mean that (a) the topic is complex enough that most information sources are merely useless and confusing, (b) social norms make it hard to talk about, or (c) bad incentives cause excessive noise to be published about it. Our job is to find more topics, not on this list, where correctness is important and the public dialogue is substantially inadequate. Then write something that's less wrong.
Risky. We could perhaps survive some discussion of public policy without any damage. But once a certain threshold was crossed, we would just start fracturing into blue and green teams.
However, what we need to do is analyze how many recruits we would lose by taking a stand on a given issue (and, even more damaging, how many nonrationalists on "our" team we would attract), and weigh the utility lost from future less-rational behavior against what is gained.
You're confusing having information with convincing other people to believe your analysis.
No. Just no.
The only time you can ignore signaling is when your positions just happen to match good signals. You can afford an occasional action that sends bad signals, but things like policy, ideology, or even positions need to be couched in nice signals.
Also, there is no consensus on what counts as a vital social norm. I think we can agree that a norm against random killing sprees would qualify. But even that, if surrounded by the right memetic scaffolding, could be made to work (perhaps a "kill the guy you hate" day could work). Social stability and prosperity are mostly polygenic.
Any talk of social norms is always a compromise between people of different values. The exact point of compromise depends heavily on the costs and benefits involved, but such a debate can never be had without bringing values into the discussion. And people have an incentive to propose bad policy, to the point of outright deception, when it accords with their values.
Kill likely-to-succeed AGI creators who haven't created a sane goal system (when no other means will work to stop them). Although I know Tim doesn't accept even that exception.