wnoise comments on A probability question - Less Wrong

Post author: PhilGoetz 19 October 2012 10:34PM


Comment author: wnoise 20 October 2012 05:44:45AM 2 points

The usual situation is that both detectors actually have some correlation to Q, and thereby have some correlation to each other.

This need not be the case. Consider a random variable Z that is the sum of two independent random variables X and Y. Expert A knows X and is thus correlated with Z; expert B knows Y and is thus correlated with Z. Yet experts A and B can still be uncorrelated with each other. In fact, you can make X and Y slightly anticorrelated and still have both be positively correlated with Z.
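A quick numerical sketch of this setup (the Gaussian choice, sample size, and variable names are my own illustration, not from the comment): X and Y are independent, Z = X + Y, and the sample correlations come out as expected.

```python
import math
import random

def corr(a, b):
    """Sample Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n
    sa = math.sqrt(sum((x - ma) ** 2 for x in a) / n)
    sb = math.sqrt(sum((y - mb) ** 2 for y in b) / n)
    return cov / (sa * sb)

random.seed(0)
n = 100_000
X = [random.gauss(0, 1) for _ in range(n)]  # what expert A knows
Y = [random.gauss(0, 1) for _ in range(n)]  # what expert B knows, independent of X
Z = [x + y for x, y in zip(X, Y)]           # the quantity both experts track

r_XZ, r_YZ, r_XY = corr(X, Z), corr(Y, Z), corr(X, Y)
print(r_XZ, r_YZ, r_XY)  # roughly 0.71, 0.71, and near 0
```

With equal-variance independent terms the theoretical correlation of each part with the sum is 1/sqrt(2) ≈ 0.71, while X and Y remain uncorrelated.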

Comment author: buybuydandavis 20 October 2012 08:47:27AM 0 points

Just consider the limiting case - both are perfect predictors of Q, with value 1 for Q, and value 0 for not Q. And therefore, perfectly correlated.

Now consider small deviations from those perfect predictors. The correlation would still be large; sometimes more, sometimes less, depending on the details of both predictors. Sometimes they will be more correlated with each other than with Q, and sometimes more correlated with Q than with each other. The degrees of correlation of A and B with Q impose limits on the degree of correlation between A and B.
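A sketch of the near-perfect case (the 5% error rate and binary setup are my own assumption): each expert reports a fair bit Q but independently flips its answer occasionally. With conditionally independent errors, the expert-to-expert correlation comes out close to the product of each expert's correlation with Q, and below either one.

```python
import math
import random

def corr(a, b):
    """Sample Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n
    sa = math.sqrt(sum((x - ma) ** 2 for x in a) / n)
    sb = math.sqrt(sum((y - mb) ** 2 for y in b) / n)
    return cov / (sa * sb)

random.seed(1)
n = 100_000
flip = 0.05  # hypothetical per-expert error rate (my choice)
Q = [int(random.random() < 0.5) for _ in range(n)]
# Each expert reports Q but independently flips its answer 5% of the time.
A = [q ^ int(random.random() < flip) for q in Q]
B = [q ^ int(random.random() < flip) for q in Q]

r_AQ, r_BQ, r_AB = corr(A, Q), corr(B, Q), corr(A, B)
print(r_AQ, r_BQ, r_AB)  # roughly 0.9, 0.9, 0.81
```

The experts end up less correlated with each other (about 0.81) than either is with Q (about 0.9), illustrating how the A–Q and B–Q correlations constrain the A–B correlation.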

And of course, correlation isn't really the issue here anyway; mutual information is closer to the right measure, and it is subject to the same sort of triangle-inequality limits.
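A minimal mutual-information sketch of that constraint (the binary setup and 10% error rate are my own illustration): if each expert is an independently noisy read of a fair bit Q, then A–Q–B forms a Markov chain, and the data-processing inequality forces I(A;B) ≤ min(I(A;Q), I(B;Q)).

```python
import math

def H(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.1                        # each expert mis-reads the fair bit Q 10% of the time
I_AQ = 1 - H(p)                # I(A;Q): what one expert knows about Q
I_AB = 1 - H(2 * p * (1 - p))  # I(A;B): experts disagree iff exactly one flipped
print(I_AQ, I_AB)              # about 0.531 and 0.320 bits
```

The experts share noticeably less information with each other (≈0.32 bits) than either has about Q (≈0.53 bits), since each one's noise is invisible to the other.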

If someone is feeling energetic and really wants to work this out, I'd recommend looking into triangle inequalities for mutual information measures, and the previously mentioned work by Jaynes on the maximum-entropy estimate of a variable from its known correlation with two other variables, and how that constrains the maximum-entropy estimate of the correlation between the other two.