minusdash comments on Beyond Statistics 101 - Less Wrong

Post author: JonahSinick 26 June 2015 10:24AM




Comment author: minusdash 26 June 2015 07:24:29PM, 3 points

What do you mean by getting surprised by PCAs? Say you have some data, you compute the principal components (eigenvectors of the covariance matrix) and the corresponding eigenvalues. Were you surprised that a few principal components were enough to explain a large percentage of the variance of the data? Or were you surprised about what those vectors were?
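The first kind of surprise mentioned above (a few components explaining most of the variance) is easy to see concretely. A minimal sketch, using simulated data with a hypothetical two-dimensional latent structure (the data, dimensions, and noise level are all made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: 200 samples in 5 dimensions, secretly driven
# by only 2 latent directions plus a little noise.
latent = rng.normal(size=(200, 2))
mixing = rng.normal(size=(2, 5))
X = latent @ mixing + 0.1 * rng.normal(size=(200, 5))

# PCA: eigendecomposition of the covariance matrix of the centered data.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh returns ascending order
eigvals = eigvals[::-1]                 # largest eigenvalue first

# Fraction of total variance explained by the top k components.
explained = eigvals / eigvals.sum()
print(explained[:2].sum())  # close to 1: two components capture most of it
```

Here the surprise would be that five measured variables collapse to two effective degrees of freedom; with real data, you find out the effective dimensionality empirically rather than by construction.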

I think this isn't really specific to PCA, or even to dimensionality reduction. It's simply the idea of latent variables. You could gain the same intuition from studying probabilistic graphical models, generative models for example.
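The latent-variable intuition can be stated without any PCA at all: observed variables are correlated merely because they share an unobserved common cause. A toy generative sketch (the loadings and noise scale here are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
# Generative story: one hidden factor z drives four observed variables.
n = 1000
z = rng.normal(size=(n, 1))            # unobserved latent variable
W = np.ones((1, 4))                    # loadings (chosen for the example)
x = z @ W + 0.2 * rng.normal(size=(n, 4))  # observed data

# The observed variables are strongly correlated with each other,
# even though none of them causes any other: they share z.
corr = np.corrcoef(x, rowvar=False)
print(corr[0, 1])
```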

Comment author: RomeoStevens 26 June 2015 07:32:02PM, 2 points

Surprised by either: just finding a causal structure that was very unexpected. I agree the intuition could be built from other sources.

Comment author: minusdash 26 June 2015 07:46:35PM, 6 points

PCA doesn't tell you much about causality, though. It just gives you a "natural" coordinate system in which the variables are linearly uncorrelated.
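This decorrelation property is easy to verify directly: rotate centered data into the principal-component basis and the off-diagonal covariances vanish. A minimal sketch with made-up correlated data (no causal claim is recoverable from this):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical correlated 2-D data (mixed through an arbitrary matrix).
X = rng.normal(size=(500, 2)) @ np.array([[2.0, 1.0], [0.0, 1.0]])
Xc = X - X.mean(axis=0)

# Rotate into the principal-component basis (eigenvectors of the covariance).
_, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
Y = Xc @ eigvecs

# Off-diagonal covariance of the rotated data is ~0: the new coordinates
# are linearly uncorrelated, but nothing here says which variable causes which.
print(np.cov(Y, rowvar=False))
```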

Comment author: VoiceOfRa 27 June 2015 01:29:52AM, 2 points

Right, one needs to use additional information to determine causality.