Wei_Dai comments on The Mechanics of Disagreement - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Hal, is this still your position, a year later? If so, I'd like to argue against it. Robin Hanson wrote in http://hanson.gmu.edu/disagree.pdf (page 9):
Robin argues in another paper that differences in priors really are irrational. I presume that he believes that differences in computation are also irrational, although I don't know if he made a detailed case for it somewhere.
Suppose we grant that these differences are irrational. It seems to me that disagreements can still be "reasonable" if we don't know how to resolve these differences, even in principle. Because we are products of evolution, we probably have random differences in priors and computation, and since at this point we don't seem to know how to resolve these differences, many disagreements may be both honest and reasonable. Therefore, there is no need to conclude that the other disagreer must be irrational (as an individual), is lying, or is not truth-seeking.
Assuming that the above is correct, I think the role of a debate between two Bayesian wannabes should be to pinpoint the exact differences in priors and computation that caused the disagreement, not to reach immediate agreement. Once those differences are identified, we can try to find or invent new tools for resolving them, perhaps tools specific to the difference at hand.
My Bayesian wannabe paper is an argument against disagreement based on computation differences. You can "resolve" a disagreement by moving your opinion in the direction of the other opinion. If failing to do this reduces your average accuracy, I feel I can call that failure "irrational".
Do you have a suggestion for how much one should move one's opinion in the direction of the other opinion, and an argument that doing so would improve average accuracy?
If you don't have time for that, can you just explain what you mean by "average"? Average over what, using what distribution, and according to whose computation?
How confident are you? How confident do you think your opponent is? Use those estimates to derive the distance you move.
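A minimal sketch of what that rule might look like, assuming a simple linear pooling of the two opinions weighted by each side's confidence (the function name and the pooling rule are my illustration, not anything specified in the thread):

```python
def pooled_opinion(mine: float, theirs: float,
                   my_conf: float, their_conf: float) -> float:
    """Weight each probability estimate by its holder's confidence."""
    total = my_conf + their_conf
    return (my_conf * mine + their_conf * theirs) / total

# If I say 0.9 with confidence 1 and you say 0.5 with confidence 3,
# I should move most of the way toward you: result is about 0.6.
print(pooled_opinion(0.9, 0.5, my_conf=1.0, their_conf=3.0))
```

Under this rule, the distance you move is exactly the other party's share of the total confidence, which matches the suggestion above: a more confident opponent pulls you further.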
It would be clearer if you said "epistemically irrational". Instrumental rationality can be consistent with sticking to your guns - especially if your aim involves appearing to be exceptionally confident in your own views.