
Vladimir_Nesov comments on Stupid Questions Open Thread Round 4 - Less Wrong Discussion

6 Post author: lukeprog 27 August 2012 12:04AM




Comment author: Vladimir_Nesov 27 August 2012 02:22:35PM  0 points

One wrinkle is that even Bayesians shouldn't have prior probabilities for everything, because if you assign a prior probability to something that could indirectly depend on your decision, you might lose out.

... your probability of arriving at the second intersection depends on your decision to go straight or turn at the first one, so treating it as unchangeable leads to weird errors.
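The dependence the quoted passage describes can be made concrete with a toy model (my construction, not anything from the original post — the function name and the specific policy parameterization are illustrative assumptions): a driver who turns with some probability at the first intersection reaches the second intersection only if she goes straight, so the "prior" probability of being at the second intersection is a function of the policy, not a fixed number that can be written down in advance of the decision.

```python
def p_reach_second(p_turn: float) -> float:
    """Chance of arriving at the second intersection, given that you
    turn with probability p_turn at the first one (toy model)."""
    return 1.0 - p_turn

# The probability of being at the second intersection is controlled
# by the decision itself; different policies yield different values.
print(p_reach_second(0.0))  # always go straight
print(p_reach_second(1.0))  # always turn
```

Treating `p_reach_second` as a constant known before choosing `p_turn` is exactly the "unchangeable" treatment the quote warns against.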

"Unchangeable" is a bad word for this: the prior can well be thought of as unchangeable, so long as you don't insist on knowing what it is. So a Bayesian may "have probabilities for everything", whatever that means, provided it's understood that those probabilities are not logically transparent, and that some of their details won't be available when making any given decision. After you make a decision that controls certain details of your prior, those details become more readily available for future decisions.

In other words, the problem is not in assigning probabilities to too many things, but in assigning them arbitrarily and thus incorrectly. If the correct assignment of probability is such that the probability depends on your future decisions, you won't be able to know this probability; so if you've "assigned" it in such a way that you do know what it is, you must have assigned the wrong thing. Prior probability is not up for grabs, etc.