Viliam_Bur comments on Open thread, August 4 - 10, 2014 - Less Wrong Discussion
There is a common idea in the "critical thinking"/"traditional rationality" community that (roughly) you should, when exposed to an argument, either identify a problem with it or come to believe the argument's conclusion. From a Bayesian framework, however, this idea seems clearly flawed. When presented with an argument for a certain conclusion, my failure to spot a flaw in the argument might be explained by either the argument's being sound or by my inability to identify flawed arguments. So the degree to which I should update in either direction depends on my corresponding prior beliefs. In particular, if I have independent evidence that the argument's conclusion is false and that my skills for detecting flaws in arguments are imperfect, it seems perfectly legitimate to say, "Look, your argument appears sound to me, but given what I know, both about the matter at hand and about my own cognitive abilities, it is much more likely that there's a flaw in your argument which I cannot detect than that its conclusion is true." Yet it is extremely rare to see LW folk or other rationalists say things like this. Why is this so?
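The Bayesian point here can be sketched numerically. This is a toy model with made-up numbers (the 1% prior and the 30% miss rate are illustrative assumptions, not anything from the comment): suppose a flawed argument always accompanies a false conclusion, and a true conclusion always comes with a flawless argument, so the only uncertainty is whether I would notice a flaw that is actually there.

```python
def posterior_conclusion_true(prior_true, p_miss_flaw):
    """P(conclusion true | I found no flaw in the argument).

    Simplifying assumptions (illustrative only):
    - if the conclusion is false, the argument contains a flaw;
    - if the conclusion is true, the argument is flawless;
    - p_miss_flaw is my chance of overlooking a flaw that exists.
    """
    p_no_flaw_given_true = 1.0          # a sound argument has no flaw to find
    p_no_flaw_given_false = p_miss_flaw  # I failed to spot the flaw
    numerator = p_no_flaw_given_true * prior_true
    evidence = numerator + p_no_flaw_given_false * (1.0 - prior_true)
    return numerator / evidence

# Strong independent evidence that the conclusion is false (prior 1%),
# and a 30% chance of missing a real flaw:
p = posterior_conclusion_true(prior_true=0.01, p_miss_flaw=0.30)
print(round(p, 3))  # the posterior rises above the prior, but stays far below 50%
```

So "your argument appears sound to me" does raise my credence in the conclusion, but with a low enough prior and a fallible flaw detector, "there's a flaw I can't see" remains by far the more probable explanation.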
A similar situation used to happen to me frequently in real life: the argument was too long, too complex, or used information that I couldn't verify... or could eventually verify, but the verification would take a lot of time. Something like: "There is this 1000-page book containing complex philosophical arguments and information from non-mainstream but cited sources, which totally proves that my religion is correct." And there is nothing obviously incorrect within the first five pages. But I am certainly not going to read it all. And the other person tries to use my self-image as an intelligent person against me, insisting that I should promise to read the whole book and then debate it (which is supposedly the rational thing to do in such a situation: hey, here is the evidence, you just refuse to look at it), or else I am not really intelligent.
And in such situations I just waved my hands and said -- well, I guess you just have to consider me unintelligent -- and went away.
I didn't think about how to formalize this properly. It was just this: I recognize the trap, and refuse to walk inside. If it happened to me these days, I could probably try explaining my reaction in Bayesian terms, but it would still be socially awkward. I mean, in the case of religion, the true answer would reveal that I believe my opponent is either dishonest or stupid (which is why I expect him to give me flawed arguments); which is not a nice thing to say to people. And yeah, it seems similar to ignoring evidence for irrational reasons.
Nothing, including rationality, requires you to look at ALL evidence that you could possibly access. Among other things, your time is both finite and valuable.