Cyan comments on Why (and why not) Bayesian Updating? - Less Wrong

Post author: Wei_Dai 16 November 2009 09:27PM


Comment author: SilasBarta 19 November 2009 11:40:44PM

If you separate out the variables into simultaneous pairs, then yes, you've destroyed the mutual information.

But if someone is allowed to look at the timewise development of each variable, they would see the mutual information, which necessarily results from one causing the other! If A causes B, then, by knowing A, you require less data to describe B (than if you did not know or could not reference A). That's the very definition of mutual information.

You can't just say that because the simultaneous pairs are uncorrelated, there is no mutual information between the variables. You showed as much when you later demonstrated that the simultaneous pairs between a function and its derivative are uncorrelated. But who denies that learning a function tells you something about its derivative? (which would mean there's mutual information between the two...)
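The function/derivative case can be sketched numerically. The snippet below is my own illustration, not from the thread: it samples x(t) = sin(t) and its derivative y(t) = cos(t) over one full period, shows that the simultaneous pairs are (essentially) uncorrelated, and yet a crude histogram estimate of the mutual information between them is far from zero.

```python
import math
from collections import Counter

# Hypothetical illustration: sample a function x(t) = sin(t) and its
# derivative y(t) = cos(t) at N evenly spaced points over one period.
N = 100_000
pairs = [(math.sin(2 * math.pi * i / N), math.cos(2 * math.pi * i / N))
         for i in range(N)]

# The simultaneous pairs are uncorrelated: the sample covariance is ~0.
mx = sum(x for x, _ in pairs) / N
my = sum(y for _, y in pairs) / N
cov = sum((x - mx) * (y - my) for x, y in pairs) / N

# Yet a crude histogram estimate of the mutual information (in nats) is
# large, because knowing sin(t) pins cos(t) down to +/- sqrt(1 - sin^2 t).
B = 20  # number of histogram bins over [-1, 1]

def to_bin(v):
    return min(int((v + 1) / 2 * B), B - 1)

joint = Counter((to_bin(x), to_bin(y)) for x, y in pairs)
px, py = Counter(), Counter()
for (i, j), c in joint.items():
    px[i] += c
    py[j] += c

mi = sum((c / N) * math.log(c * N / (px[i] * py[j]))
         for (i, j), c in joint.items())

print(f"covariance = {cov:.2e}, MI estimate = {mi:.2f} nats")
```

The covariance vanishes by the symmetry of sin·cos over a full period, while the MI estimate stays well above zero — exactly the gap between "uncorrelated" and "no mutual information" that the comment points at.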

Comment author: Cyan 20 November 2009 03:36:35AM

If A causes B...

No need to bring up causality. It's enough that knowledge of A specifies B too.

Comment author: SilasBarta 20 November 2009 04:57:26PM

Yes, that's correct. I only mentioned causality to make my comment relevant to the context Kennaway brought up.