Vaniver comments on Philosophy Needs to Trust Your Rationality Even Though It Shouldn't - LessWrong

Post author: lukeprog 29 November 2012 09:00PM

Comment author: Vaniver 30 November 2012 06:18:49AM 1 point

(As likelihood ratios get smaller, your priors need to be better and your updates more accurate.)

It seems to me that rationality is more about updating the correct amount, which is primarily calculating the likelihood ratio correctly. Most of the examples of philosophical errors you've discussed come from not calculating that ratio correctly, not from starting out with a bizarre prior.

For example, consider Yvain and the Case of the Visual Imagination:

Upon hearing this, my response was "How the stars was this actually a real debate? Of course we have mental imagery. Anyone who doesn't think we have mental imagery is either such a fanatical Behaviorist that she doubts the evidence of her own senses, or simply insane."

This looks like having the same prior as many other people; the rationality was in actually running the experiment and computing the likelihood ratio, which was strong enough to overcome the extreme prior. You could say that Galton only considered the question because he had a non-extreme prior, and that if people trusted their intuitions less and practiced more curious agnosticism, their beliefs would converge faster. But it seems to me that the curiosity (i.e., looking for evidence that favors one hypothesis over another) matters more than the agnosticism: the goal is not "I could be wrong" but "I could be wrong if X."
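The odds form of Bayes' rule makes the point concrete: a strong enough likelihood ratio can overwhelm even an extreme prior. A minimal sketch (the numbers are illustrative assumptions, not from the comment or from Galton's survey):

```python
def posterior_odds(prior_odds, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    return prior_odds * likelihood_ratio

def odds_to_prob(odds):
    """Convert odds O into a probability O / (1 + O)."""
    return odds / (1 + odds)

# Extreme prior against the hypothesis: 999-to-1 odds (P = 0.001).
prior_odds = 1 / 999

# Strong evidence: the observations are 10,000x more likely
# if the hypothesis is true than if it is false.
lr = 10_000

post = odds_to_prob(posterior_odds(prior_odds, lr))
print(round(post, 3))  # ~0.909: the evidence overcomes the prior
```

This is why getting the likelihood ratio right matters so much when priors are extreme: the evidence has to do almost all of the work, so a miscalculated ratio swamps any reasonable choice of prior.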