Cyan comments on Why (and why not) Bayesian Updating? - Less Wrong
If you separate out the variables into simultaneous pairs, then yes, you've destroyed the mutual information.
But if someone is allowed to look at the timewise development of each variable, they would see the mutual information, which necessarily results from one causing the other! If A causes B, then knowing A lets you describe B with less data than if you did not know (or could not reference) A. That's the very definition of mutual information.
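To make that claim precise (a standard identity, added here rather than part of the original comment):

$$I(A;B) = H(B) - H(B \mid A),$$

i.e. the number of bits saved in describing B once A is known. It is strictly positive exactly when knowing A shortens the optimal description of B.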
You can't conclude that there is no mutual information between the variables just because the simultaneous pairs are uncorrelated. You showed as much yourself when you later demonstrated that the simultaneous pairs of a function and its derivative are uncorrelated. But who denies that learning a function tells you something about its derivative? (Which is just to say there is mutual information between the two.)
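A quick numerical sketch of that point (my own illustration, not from the thread): take f(t) = sin(t) as the function and f'(t) = cos(t) as its derivative, with t drawn uniformly over one period. The simultaneous pairs are uncorrelated, yet a crude plug-in histogram estimate of the mutual information (biased, but fine for the qualitative point) is clearly positive.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Simultaneous pairs" of a function and its derivative:
# f(t) = sin(t) and f'(t) = cos(t), with t uniform over one period.
t = rng.uniform(0.0, 2.0 * np.pi, size=100_000)
f = np.sin(t)
df = np.cos(t)

# The pairs are (essentially) uncorrelated: E[sin(t)cos(t)] = 0 over a period.
print("correlation:", np.corrcoef(f, df)[0, 1])

def binned_mi(x, y, bins=40):
    """Crude plug-in estimate of mutual information (in bits) from a 2-D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, bins)
    nz = pxy > 0                          # skip empty cells to avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

# ...yet the mutual information is clearly positive: knowing sin(t) pins
# cos(t) down to at most two values, +/- sqrt(1 - sin(t)^2).
print("binned MI (bits):", binned_mi(f, df))
```

Zero correlation only rules out a *linear* relationship; mutual information picks up the remaining (here, two-branched deterministic) dependence.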
No need to bring up causality. It's enough that knowledge of A specifies B.
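One way to spell that out (a standard fact, added here for clarity): if knowledge of A fully specifies B, say B = g(A), then

$$H(B \mid A) = 0 \quad\Longrightarrow\quad I(A;B) = H(B),$$

the largest value the mutual information can take, with no causal story required.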
Yes, that's correct. I only mentioned causality to make my comment relevant to the context Kennaway brought up.