When it comes to issues like superstition, abortion, gender, race, pseudoscience, consciousness, etc., so far everyone who disagrees with my views tends to offer horribly illogical arguments, or even no justification at all for their views. This doesn't really apply to LWers; I'm talking about outside of this site.
How about people who agree with your views - how many of them have horrible illogical arguments or even a complete lack of justification?
LessWrong is heavily biased towards the kind of people who can articulate why exactly they believe something in a logical-sounding way, compared to people from a random internet forum or people one might meet in person. So maybe what you're seeing is just that a lot of people have horribly illogical justifications, it's just that the only time you have to listen to a justification for a belief is when you disagree with someone.
Conversations rarely go:
- I think homosexual marriage should be legal.
- I fully agree, but *why* do you believe that?
- Well, because <illogical and incoherent justification>
In terms of strength of the effect, I'd guess it's something like:
1) stupidity is correlated with both wrong opinions and bad justifications
2) you're more likely to encounter explanations for beliefs you disagree with
3) your judgement of which justifications are good or bad is biased by whether they support your beliefs
This is exactly the case, and it is sometimes quite shocking to hear sincere justifications from people who agree with you that are insanely wrong.
While taking the ideological Turing test I had a hell of a time fighting the urge to label people giving bad arguments for atheism as Christians. It turns out that many atheists have very bad reasons for being atheists. Not all skeptics are rationalists.
There's also the effect of topic selection. I've seen an example on an atheist forum in which users were astoundingly adept at refuting creationist arguments, but as soon as anybody posted a silly economic conspiracy theory, the posters seemed to take it at face value. It turns out that, politically, you can predict most people's opinions on one thing by their opinion on another thing (e.g. opinions about abortion correlate with opinions about gun control). Try prodding around with people who agree with you on one issue and see how quickly they become insane when you bring up an unrelated issue.
Conversations rarely go:
- I think homosexual marriage should be legal.
- I fully agree, but why do you believe that?
- Well, because <illogical and incoherent justification>
You haven't been to a Mises.org discussion of intellectual property rights, have you?
Perhaps I am, myself, victim to some bias that I have not read about despite months on this site. Does anyone recognize it? Or, in general, does anyone have anything to add to this?
Have you kept a list of things you were wrong on? Or do you simply change your mind that rarely and are that accurate?
I've been on these forums back when it was still the SL4 mailing list, and I think I've internalized a lot of it, and I contribute good content (I'm not in the top karma list, but I'm not that far away). And yet, I am still fallible:
My predictions routinely fail, sometimes even very high confidence predictions; see:
Are people who agree with you significantly more likely to put forth good arguments, or does everyone demonstrate horrible illogic, regardless of whether they agree with you?
Are you counting occasions on which someone persuaded you to change your mind to agree with them as counter-evidence?
Part of it is selection bias, I would say. You think that most people who disagree with you are illogical, but you don't say anything about those who agree with you. Are they more logical than the average person? (If most people make terrible arguments, then it's probably the case that most people who disagree with you will make terrible arguments.)
Also,
Perhaps I am, myself, victim to some bias that I have not read about despite months on this site. Does anyone recognize it?
You think that most people who disagree with you are illogical, but you don't say anything about those who agree with you. Are they more logical than the average person?
Yes. It wouldn't take more than a slight bias toward logical thinking in me; selection effects will see to that.
Dunning-Kruger is only applicable when the person is incompetent but believes himself competent. The very act of questioning one's own competence is sufficient grounds to dismiss the D-K effect as a possibility.
You should expect that (even in our crazy world) there are some people who hold a set of beliefs such that to disagree with them is to be wrong. Your prior for being that person is low
I am naturally distrustful of such conclusions
but you have some evidence
my experiences in schools made it very hard.
that you feel is not enough to overcome the prior improbability and produce a decent posterior? No, according to the line
I think the most likely thing is that they're all confused by biases
you do have such a posterior.
Ah. You're worried that the evidence you have does not sufficiently distinguish between 'I am rational' and 'I suffer from the Dunning-Kruger effect'. In that case, think about what evidence would sufficiently distinguish 'I am right a lot' and 'I think I am right a lot'. This recommendation is a good place to start.
Downvoted, as this reads a little like trolling, and the title isn't exactly accurate. I believe, from your first paragraph, that you mean "how come people who disagree with me put forth bad arguments?" Someone isn't necessarily incorrect for having poor rhetoric; it's worth noting that the objections they give might not be their true objection. If I were to say "the earth is 4.5 billion years old because scientists have carbon-dated rocks which date that far," I would have given a bad argument: carbon dating can't be used to do that. The reason the fictional me accepts the age of the earth is probably that most scientific authorities say so, which isn't a bad metric for our hypothetical scientific incompetent.
It's worth noting that being right on abortion, gender, and race rather depends on precisely how one defines being correct.
There are a lot of good responses here, but I'd like to bring up just one more possible explanation of what is going on. Both you and the people you are talking to may be expecting shorter inferential distances, so they (and you) may have implicit facts and premises which aren't being stated in these discussions. This could easily lead to arguments where the other person seems to be illogical.
The paper Are Disagreements Honest? by Tyler Cowen and Robin Hanson seems relevant. I strongly recommend reading the paper, but the abstract summarizes the main relevant point:
Typical disagreement seems explainable by a combination of random belief influences and by priors that tell each person that he reasons better than others.
(Edited for clarity)
Is there any controversial position you have studied and rejected? Try arguing that as a control. Argue in a way worth refuting and see if people can. As kam says, check both people who agree and disagree with you.
Once you establish that people are unable to communicate justification for any of their views, then what? Don't jump to the conclusion that they don't have justifications. Why would they want to share with you? If very few arguments are floating around, you should not be proud of having considered all the ones you have encountered.
Humblest apologies to the OP if this isn't the case, but does this read a little bit like a trolling attempt to anyone else?
Until the very end I was sure it was a hilarious piece of rationalist satire. "How come everyone that disagrees with me is wrong?" is like the funniest sentence I've read today.
I find that I am capable of changing my mind more often than other people apparently do - which might produce similar results, namely that I am wrong for a shorter period of time.