Many of my posts have turned out to be wrong on later examination. Looking back, the reason every time was that I tried to solve a hard problem, found a solution that looked new and interesting, and failed to examine it enough before posting.
The phrase "worse than random" in your post sounds misleading to me. Many questions have answers in the form of long sentences, not just yes/no. A random number generator has an astronomically low chance of generating a coherent sentence, never mind interesting or correct. So even if I'm wrong every time, I still like to think that I'm doing better than random: my faulty proofs can be patched, and my faulty explanations can still point toward the truth.
A random number generator has an astronomically low chance of generating a coherent sentence, never mind interesting or correct.
Good point. I would probably have to keep a careful record only of statements on yes/no questions to get around this.
I realize that there are strong social reasons for not giving an example, but it is very hard to determine which of these hypotheses is correct without one.
I'm actually not sure the usual social reasons apply here. He's already flattered their intelligence, and I can't understand why someone would be offended by a public admission of disagreement. Also, assuming the posters he is referring to have high status here already (a reasonable assumption given that they are smart and post a lot), I can't see why they would be uncomfortable being mentioned by name -- shoot, the way this post presents them, I'd be very pleased with myself if Phil were talking about me.
If there is some possible concern I'm missing Phil could at least send the people in question PMs, asking their permission to use them as examples.
I notice that you haven't noted whether they tend to get upvoted or downvoted, or offered a hypothesis as to why.
"Usually wrong" is a strong claim. I don't know of anyone on LW who is literally "usually wrong" in the sense that the majority of the statements they make are false; the majority of statements most people make are trivially true, like "The sky is blue" or "I'm hungry".
To say someone is "usually wrong", you have to mean "out of the controversial statements they make, most are wrong". But "controversial statements" are hard to define, and on border cases, whether or not you classify a statement as controversial might depend on whether or not you agree with it. I might classify "global warming does not exist" as a hit against someone in the "are they usually right or wrong" tables without necessarily remembering to award someone a point every time they say "global warming exists". I would definitely count "the world is controlled by sentient lizards" as a hit against someone without counting "the world is not controlled by sentient lizards" as a point in their favor.
If you have this same bias, that would make you more likely to believe someone is "usually wrong", when you may only disagree with them on a few things.
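A quick simulation of how much this asymmetric bookkeeping can distort things (entirely my own toy numbers: a poster who is right 70% of the time on controversial claims, and an observer who always logs the misses but only credits 30% of the hits):

```python
import random

random.seed(0)
true_accuracy = 0.7    # assumed accuracy on controversial claims
recall_correct = 0.3   # assumed chance a correct claim is remembered and credited

hits = points = 0      # hits = wrong claims tallied against them; points = credited correct claims
for _ in range(10_000):
    correct = random.random() < true_accuracy
    if not correct:
        hits += 1                           # wrong claims are always tallied
    elif random.random() < recall_correct:
        points += 1                         # correct claims are only sometimes tallied

print(f"apparent accuracy: {points / (points + hits):.2f}")  # ~0.41, vs. a true 0.70
```

Under this tallying, someone who is right seven times out of ten comes out looking "usually wrong".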
Although this is a bit close to the last hypothesis, I would suggest another alternative: the possibility that they are being meta-contrarian.
They advocate (possibly wrong) opinions to signal that they stand out from the crowd. Did I unpack that right?
Not to ones that they themselves suspect are wrong, if that's what you mean. But it's hard to signal high intellectual status while expressing the same beliefs as all of your peers, so if you want to signal, you have a motive to find a point of disagreement.
Do you find them to be usually wrong in most of the subjects they discuss?
I also have several posters whom I have mentally labeled as both "generally smart" and "holds some unbelievably stupid ideas" - including even a couple of the Top Contributors - but the second label applies only in one or a few specific areas for each such poster, so I tend to believe that they are showing the consequences of one particular bias or preconception on a particular subject, rather than something being unusually wrong with their general mental processes.
Remember that idea space is large, so maybe what you have found is people with a significantly different perspective/worldview on things. "Flipping a coin" is not a good analogy.
Consider the following case, which I think would generate similar observations. LW members are of two different worldviews, wv1 and wv2, and their opinions on some subset of topics are primarily determined by their worldview. Then whenever you write a post reflecting your worldview, the people with the other worldview will disagree consistently.
I suppose this is really a restatement of your last theory.
I have definitely found this to be the case at least when it comes to somewhat political issues, even among quite smart people.
Reversed intelligence does not look like natural stupidity. If Alice consistently disagrees with you, then she must (1) be as smart as you and (2) want to taunt you. Point (1) is unnecessary if she has prior access to your opinions (does she?), but point (2) is very weird. (Taunting LW generally is much more plausible than taunting you personally.) Another causal link that avoids (2) is that she might annoy you so much that you change your opinion to oppose her. I very much doubt that these people accurately negate your personal opinions, but if they do, this seems to me the most likely scenario. (Maybe this is #3?)
Natural disagreement, whether due to your error or Alice's, should look like she regresses from you to some reference population, such as LW's, the general population's, or an ideology. Thus David Gerard's question seems to be key.
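A toy model of what that regression picture would look like (my own construction; opinions as reals in [-1, 1] with arbitrary noise, and it needs Python 3.10+ for statistics.correlation):

```python
import random
import statistics

random.seed(0)
n = 1000
# Alice's view on each question is an independent noisy draw around a reference
# population's view; she never consults your opinions at all.
you = [random.uniform(-1, 1) for _ in range(n)]
reference = [random.uniform(-1, 1) for _ in range(n)]
alice = [r + random.gauss(0, 0.3) for r in reference]

print(f"corr(Alice, you)       = {statistics.correlation(alice, you):+.2f}")        # ~0.00
print(f"corr(Alice, reference) = {statistics.correlation(alice, reference):+.2f}")  # strongly positive (~+0.9)
```

Genuine negation of your opinions would instead show up as a correlation near -1 with you specifically, which is part of what makes it such an implausible hypothesis.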
Your three hypotheses about attention seem likely to me:
- What I am actually detecting is smart people who have strong opinions on, and are likely to comment on, areas where I am either wrong, or have a minority opinion.
- These people comment only on difficult, controversial issues which are selected as issues where people perform worse than random.
- Many of these comments are in response to comments or posts of mine, which I made only because I found them interesting, and I found them interesting precisely because I already disagreed with smart people about the answers.
I don't think what he describes requires reversed intelligence. There are many more ways to disagree than there are to agree.
There's an old canard about the media in general and The Economist in particular: that it seems less insightful the more you know about a topic, because it's written by people who are smart and perceptive but not experts in anything much but writing. I may be projecting here, but I think a lot of people on this board are overconfident in their ability to overturn conventional wisdom in fields they know only from osmosis and popular non-fiction; we think being smart is an acceptable substitute for being knowledgeable (and yes, I recognize the irony in that mistake).
If this is true, I suspect that most of your data points will be from historical questions, which are the easiest to think you understand, and that they will tend to be speculative or critical points rather than thorough explanations. Is this accurate?
Everyone's judgment and prejudices are influenced by their life experiences and, to some extent, their personalities. On just about any topic where the facts aren't entirely clear, intelligent people are not going to all agree. The directions in which they disagree will be determined by judgment and prejudices. It makes sense that people on this board will find themselves disagreeing with the same people repeatedly.
It would be helpful to know more about the sorts of things they typically post about, but I understand you probably don't want to inadvertently reveal the smart-wrong individuals you have in mind.
The reason I say this is because there are probably a lot of people out there like me - people who, while liking the LessWrong community and its stewards overall, have some serious bones to pick with some of the "core" "rationalist" beliefs and approaches to various questions. These two things in combination beget an urge to respectfully but persistently voice disagreement with things most others here take as received wisdom. I haven't ever really posted here, for instance, but if I did, I know that I would mainly only do so when I disagreed with something I felt most people here took to be obvious. Or, to put a more positive spin on it, I would only post when I stood an unusually high chance of being corrected-if-wrong. Since the opinions of mine that stand the best chance of this here are those that are far out of line with the beliefs of the average LessWrong user, it follows that if your stance is sufficiently close to that of the average LWer, most of what I'd post you'd disagree with.
Of course, I may not count, because I may not be smart! And you may deviate greatly from the average. But in that case, it should be pretty obvious why you find smart people you persistently disagree with on this site.
(NB: I put the quotes around the word "rationalist" not to cast aspersions on its use by people here, but because I wouldn't consider myself an LW-style rationalist despite being someone who cares deeply about thinking rationally. Kind of like how an agnostic might opt out of calling atheists "brights" on account of not being an atheist while still considering himself bright.)
These people comment only on difficult, controversial issues which are selected as issues where people perform worse than random.
Related, maybe they only comment when they have something original and unorthodox to say (selection bias). It's easy to echo conventional wisdom and be right most of the time; for a smart person it's more exciting to challenge conventional wisdom, even if this gives them a higher risk of being wrong. In other words, maybe they place a lower priority on karma points, and more on building their muscles for original thought.
Example 1: In my youth, I tried to only hold beliefs I could derive myself, rather than accepting what was told to me. As a result, I held many unorthodox beliefs, many of which turned out to be wrong. Statistically, I would have had a better track record if I had just accepted the conventional view.
Example 2: Robin Hanson. I think he is wrong a lot of the time, but he also thinks for himself a lot more than I do, and has advanced human thought way more than I have. He could easily hold more conventional views and increase his accuracy, but I'm sure he finds the risk and challenge appealing.
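Here is a toy sketch of how strong that selection effect can be (all three rates below are my own made-up assumptions, not anyone's measured track record):

```python
import random

random.seed(0)
p_unorthodox = 0.2        # assumed share of a poster's beliefs that are contrarian
p_right_orthodox = 0.9    # assumed accuracy when echoing conventional wisdom
p_right_unorthodox = 0.4  # assumed accuracy on contrarian beliefs

held = held_right = posted = posted_right = 0
for _ in range(100_000):
    unorthodox = random.random() < p_unorthodox
    right = random.random() < (p_right_unorthodox if unorthodox else p_right_orthodox)
    held += 1
    held_right += right
    if unorthodox:                # selection: only contrarian views get posted
        posted += 1
        posted_right += right

print(f"accuracy of all beliefs:     {held_right / held:.2f}")      # ~0.80
print(f"accuracy of posted comments: {posted_right / posted:.2f}")  # ~0.40
```

The poster is right 80% of the time overall, but a reader who only ever sees what they post will record them as wrong more often than not.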
I notice that you seem to exclude the possibility that it is they who are right and you who are wrong.
I point that out because that is one of the main reasons smart people hold wrong beliefs.
Maybe making a firm statement like "The universe behaves in X fashion" makes for briefer and more interesting conversation than "The universe, which may or may not exist, may or may not behave in X fashion under conditions {abc} but not under conditions {def}".
If someone is pretty sure about a thing, they will probably say it is true, rather than that it is nearly certainly true according to their subjective understanding.
That is to say, one factor which may or may not be a cause, but not the only cause, of what you are observing is others' desire to be brief in their statements, sacrificing complete thoroughness and rigor either intentionally or unintentionally... or perhaps not.
There are several posters on Less Wrong whom I
So I think they are exceptionally smart people whose judgement is consistently worse than if they flipped a coin.
How probable is this?
Some theories: