JGWeissman comments on Information theory and the symmetry of updating beliefs - Less Wrong

45 Post author: Academian 20 March 2010 12:34AM


Comment author: JGWeissman 21 March 2010 09:23:54PM

> [EG1] P(A)=1/4, P(B)=P(C)=1/2, P(BC)=1/4, P(AC)=P(BC)=1/16, P(ABC)=1/64.

I assume you meant:

[EG1] P(A)=1/4, P(B)=P(C)=1/2, P(BC)=1/4, P(AC)=P(AB)=1/16, P(ABC)=1/64.
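As a sanity check on the corrected numbers (my addition, not part of the original thread): inclusion-exclusion over the stated values yields eight nonnegative elementary probabilities that sum to 1, so the corrected [EG1] is a valid joint distribution. A sketch using exact fractions:

```python
from fractions import Fraction as F

# Corrected [EG1] values (A = the condition, B and C = positive test results).
pA, pB, pC = F(1, 4), F(1, 2), F(1, 2)
pAB, pAC, pBC = F(1, 16), F(1, 16), F(1, 4)
pABC = F(1, 64)

# The eight elementary outcomes, recovered via inclusion-exclusion.
atoms = {
    "ABC":    pABC,
    "AB~C":   pAB - pABC,
    "A~BC":   pAC - pABC,
    "A~B~C":  pA - pAB - pAC + pABC,
    "~ABC":   pBC - pABC,
    "~AB~C":  pB - pAB - pBC + pABC,
    "~A~BC":  pC - pAC - pBC + pABC,
    "~A~B~C": 1 - pA - pB - pC + pAB + pAC + pBC - pABC,
}

assert all(p >= 0 for p in atoms.values())
assert sum(atoms.values()) == 1
```

With the uncorrected values (P(AC)=P(BC)=1/16), the quantities are not even mutually consistent, since P(BC) cannot be both 1/4 and 1/16.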

> Then just suppose there are some medical diagnostics with these probabilities, etc.

You just glossed over the whole point of this exercise. The problem is that values such as P(ABC) are combined facts about the population and both tests. Try defining the scenario using only facts about the population in isolation (P(A)), facts about the tests in isolation (P(B|A), P(C|A), P(B|~A), P(C|~A)), and facts about the dependence between the tests (P(B|AC), P(B|A~C), P(B|~AC), P(B|~A~C) [EDIT: removed redundant terms, I got carried away when typing out the permutations]). The point is to demonstrate that you have to contrive certain properties of the dependence between the tests, namely the conditional probabilities given ~A, to make the summary properties you care about work out for a specific, separate fact about the population, P(A).
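To make the entanglement concrete (my illustration, not part of the original comment), the factored facts can be recovered from the corrected [EG1] joint values. Doing so exposes the contrivance: the tests come out independent given A but dependent given ~A. A sketch with exact fractions:

```python
from fractions import Fraction as F

# Corrected [EG1] joint facts (A = the condition, B and C = positive tests).
pA, pB, pC = F(1, 4), F(1, 2), F(1, 2)
pAB, pAC, pBC = F(1, 16), F(1, 16), F(1, 4)
pABC = F(1, 64)

# Facts about the tests in isolation.
pB_A  = pAB / pA                    # P(B|A)  = 1/4
pB_nA = (pB - pAB) / (1 - pA)       # P(B|~A) = 7/12

# Facts about the dependence between the tests.
pB_AC  = pABC / pAC                 # P(B|AC)  = 1/4
pB_nAC = (pBC - pABC) / (pC - pAC)  # P(B|~AC) = 15/28

# Given A, the tests happen to be conditionally independent ...
assert pB_AC == pB_A
# ... but given ~A they are not: P(B|~AC) != P(B|~A).
assert pB_nAC != pB_nA
```

Working in this direction shows why the construction is backwards: P(B|~AC) had to be tuned to 15/28 for the joint numbers to cooperate with P(A)=1/4, rather than being a standalone fact about the tests.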