
IlyaShpitser comments on Open thread, 25-31 August 2014 - Less Wrong Discussion

Post author: jaime2000 25 August 2014 11:14AM




Comment author: IlyaShpitser 25 August 2014 11:24:44PM * 0 points

If A, B, C are binary, the values of A and B are drawn from independent fair coins, and C = A XNOR B (that is, C = 1 exactly when A = B), then measuring C = 1 constrains A,B to be either { 1, 1 } or { 0, 0 }, but does not constrain A alone at all.

Before we conditioned on C = 1, every value of the joint variable A,B had probability 0.25, and each value of the single variable A had probability 0.5. After we condition on C = 1, the values { 0, 0 } and { 1, 1 } of A,B each assume probability 0.5, the values { 0, 1 } and { 1, 0 } of A,B assume probability 0, and the values of the single variable A remain at probability 0.5.
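The arithmetic can be checked with a short sketch (nothing here beyond the setup above; variable names are mine):

```python
from itertools import product

# Joint prior over (A, B): four equally likely outcomes, 0.25 each.
outcomes = list(product([0, 1], repeat=2))

# Condition on C = 1, where C = 1 exactly when A equals B.
conditioned = [(a, b) for a, b in outcomes if a == b]

# Posterior over the joint variable (A, B): mass concentrates on
# the two surviving outcomes.
p_joint = {ab: 1 / len(conditioned) for ab in conditioned}
print(p_joint)  # {(0, 0): 0.5, (1, 1): 0.5}

# Marginal posterior for A = 1 alone: unchanged at 0.5.
p_a1 = sum(p for ab, p in p_joint.items() if ab[0] == 1)
print(p_a1)  # 0.5
```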

By conditioning on C = 1, you learn more about the joint variable A,B than about the single variable A (your posterior for A,B changed, but your posterior for A did not), but that is not the same thing as the joint variable A,B being more plausible than the single variable A. In fact, it is still the case that p(A = a, B = b | C = 1) <= p(A = a | C = 1) for every pair of values a, b.


edit: others below said the same, and often better.