cupholder comments on Even if you have a nail, not all hammers are the same - Less Wrong

95 Post author: PhilGoetz 29 March 2010 06:09PM




Comment author: Matt_Simpson 29 March 2010 09:22:24PM 4 points

A better methodology would have been to use piecewise (or "hockey-stick") regression, which assumes the data falls into two sections (typically one sloping downward and one sloping upward), finds the right breakpoint, and fits a separate linear regression on each side of the break, with the two lines meeting at the break. (I almost called this "The case of the missing hockey-stick," but thought that would give the answer away.)
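To make the idea concrete, here's a quick numpy sketch (mine, not from the thread) of one common way to fit a hockey-stick: add a "hinge" term max(x − c, 0), which lets the slope change at c while forcing the two segments to meet there, and grid-search c by least squares.

```python
import numpy as np

def hockey_stick_fit(x, y, candidates):
    """Grid-search the breakpoint c, fitting y = b0 + b1*x + b2*max(x - c, 0).

    The hinge term max(x - c, 0) changes the slope at c while keeping the
    two line segments continuous there. Returns (best_c, coeffs, sse)."""
    best = None
    for c in candidates:
        X = np.column_stack([np.ones_like(x), x, np.maximum(x - c, 0.0)])
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = float(np.sum((y - X @ coeffs) ** 2))
        if best is None or sse < best[2]:
            best = (c, coeffs, sse)
    return best

# Synthetic U-shaped data: slope -1 below x = 5, slope +2 above, plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = np.where(x < 5, 10 - x, 5 + 2 * (x - 5)) + rng.normal(0, 0.3, x.size)

c, coeffs, sse = hockey_stick_fit(x, y, candidates=np.linspace(1, 9, 81))
# coeffs[1] is the left slope; coeffs[1] + coeffs[2] is the right slope.
```

Each candidate fit is still ordinary linear regression; only the breakpoint search is nonlinear.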

An even better methodology would be to allow for higher-order terms in the regression model. Adding squared terms, the model would look like this:

y = β0 + β1x + β2x² + ε

or, with two explanatory variables:

y = β0 + β1x1 + β2x1² + β3x2 + β4x2² + ε
This would allow for those nice-looking curves you were talking about. And it can be combined with logistic regression. Really, regression is very flexible; there's no excuse for what they did.
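A minimal sketch of the point (my own illustration): a model with a squared term is still fit by ordinary linear regression, because it's linear in the coefficients even though x² is a nonlinear function of x.

```python
import numpy as np

# Simulate data from y = 2 - 1.5*x + 0.8*x^2 + noise.
rng = np.random.default_rng(1)
x = rng.uniform(-3, 3, 300)
y = 2.0 - 1.5 * x + 0.8 * x**2 + rng.normal(0, 0.2, x.size)

# Design matrix with a squared column; the fit itself is plain least squares.
X = np.column_stack([np.ones_like(x), x, x**2])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
# coeffs should come out close to (2.0, -1.5, 0.8).
```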

Also, the scientists could have done a little model checking. If what Phil says about the U/J-shaped response curve is true, the first-order model would have been rejected by some sensible model selection criterion (AIC, BIC, stepwise selection, lack-of-fit F test, etc.).
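To show how a criterion like AIC would have caught this (again my own sketch, using the Gaussian-errors form of AIC): fit both the first-order and the quadratic model to U-shaped data and compare.

```python
import numpy as np

def aic(y, yhat, k):
    """Gaussian AIC up to a constant: n*log(SSE/n) + 2k, k = # of parameters."""
    n = y.size
    sse = np.sum((y - yhat) ** 2)
    return n * np.log(sse / n) + 2 * k

# U-shaped response: a straight line can't capture this.
rng = np.random.default_rng(2)
x = np.linspace(-2, 2, 100)
y = x**2 + rng.normal(0, 0.3, x.size)

X1 = np.column_stack([np.ones_like(x), x])          # first-order model
X2 = np.column_stack([np.ones_like(x), x, x**2])    # adds a squared term
b1, *_ = np.linalg.lstsq(X1, y, rcond=None)
b2, *_ = np.linalg.lstsq(X2, y, rcond=None)

aic1 = aic(y, X1 @ b1, 2)
aic2 = aic(y, X2 @ b2, 3)
# The quadratic model wins (lower AIC) by a wide margin on data like this.
```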

related side note: In my grad stat classes, "Linear Regression" usually includes things like my example above - i.e. linear functions of the (potentially transformed) explanatory variables, including higher-order terms. Is this different from how the term is widely used?

unrelated side note: is there a way to type pretty math in the comments?

followup question: are scientists outside of the field of statistics really this dumb when it comes to statistics? It seems like they see their standard methods (i.e., regression) as black boxes that take data as an input and then output answers. Maybe my impression is skewed by the examples popping up here on LW.

Comment author: cupholder 30 March 2010 04:44:57AM 1 point

related side note: In my grad stat classes, "Linear Regression" usually includes things like my example above - i.e. linear functions of the (potentially transformed) explanatory variables, including higher-order terms. Is this different from how the term is widely used?

I don't think it is. I seem to remember reading in Wonnacott & Wonnacott's textbook that you can still call it 'linear regression' whether or not one of those regressors is a nonlinear function of another.

That makes sense intuitively, because a linear regression algorithm doesn't care where your regressors come from, so conceptually it's irrelevant whether they all turn out to be different functions of the same variable, for example. (Barring obvious exceptions like your regressors all being linear functions of the same variable, which would make them collinear and of course mess up your regression.)
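The exception is easy to see in the design matrix (a small illustration of my own): polynomial terms of one variable give independent columns, but several linear functions of the same variable are collinear, so the matrix loses rank and the regression breaks.

```python
import numpy as np

x = np.linspace(0, 1, 50)

# Polynomial regressors of one variable: columns are linearly independent.
X_poly = np.column_stack([np.ones_like(x), x, x**2])

# Several linear functions of the same variable: 2x + 3 = 2*(x) + 3*(1),
# so the third column is a combination of the first two.
X_lin = np.column_stack([np.ones_like(x), x, 2 * x + 3])

print(np.linalg.matrix_rank(X_poly))  # 3 -> full rank, regression is fine
print(np.linalg.matrix_rank(X_lin))   # 2 -> singular design, fit not unique
```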

unrelated side note: is there a way to type pretty math in the comments?

I don't know of one, but I haven't been here long!

followup question: are scientists outside of the field of statistics really this dumb when it comes to statistics?

My understanding is, a lot of them aren't...but a lot of them are.