
Douglas_Knight comments on She Blinded Me With Science

Post author: Jonathan_Graehl 04 August 2009 07:10PM 13 points


Comment author: Douglas_Knight 04 August 2009 08:47:23PM 7 points

> I suspect that many authors are hesitant to subject themselves to the sort of scrutiny they ought to welcome.

Normative language ("ought") is not helpful here. Journals that nominally require publication of data or calculations don't enforce it, either.

One way to deal with selection bias and fraud that I have occasionally seen, and only in economics and parapsychology ("the control group for science"), is to compare the effect size to the study size. If the effect is real, its estimated size will not depend on the study size. But if it's fake, it will always be just barely statistically significant, so the estimated effect will decline as the studies get larger.

This kind of meta-analysis comes from not trusting one's peers. That is rude, hence rare. But it's a lot more useful than the usual kind of meta-analysis, which simply pools the data.
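
A minimal simulation sketch of that check (the sample sizes, the 0.3 "real" effect, and the one-sided p < 0.05 publication filter below are illustrative assumptions, not anything from the comment): when only significant results get "published", the average published estimate of a null effect keeps shrinking as studies grow, while a real effect settles near its true value.

```python
import numpy as np

rng = np.random.default_rng(0)

def published_effect(true_effect, n, n_attempts=500):
    """Mean estimated effect across simulated two-arm studies (n per arm,
    unit variance) that pass a one-sided p < 0.05 significance filter."""
    kept = []
    for _ in range(n_attempts):
        treatment = rng.normal(true_effect, 1.0, n)
        control = rng.normal(0.0, 1.0, n)
        diff = treatment.mean() - control.mean()
        se = np.sqrt(2.0 / n)
        if diff / se > 1.645:  # crude "only significant results get published" filter
            kept.append(diff)
    return np.mean(kept) if kept else float("nan")

for n in (20, 80, 320, 1280):
    print(f"n={n:>4}  published estimate of null effect: {published_effect(0.0, n):+.3f}  "
          f"of a true 0.3 effect: {published_effect(0.3, n):+.3f}")
```

This is roughly the intuition behind funnel plots and other small-study-effect checks in meta-analysis.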

Comment author: Eliezer_Yudkowsky 04 August 2009 10:04:13PM 10 points

The obvious solution, IMO, is to have journals approve study designs for publication in advance, including all statistical tools to be used; and then you do the study and run the preselected analysis and publish the results, regardless of whether positive or negative.

But just like many other obvious improvements we can all think of to the process of science, this one will not be carried out.

parapsychology ("the control group for science")

Did you get that off me? I was planning a post on it at some point or another.

Comment author: bentarm 05 August 2009 01:46:31PM 3 points

That's the obvious brute-force solution, but a possibly more elegant route is just to have an international trials register. This suggestion has been around for a while, and should be significantly less costly (and controversial) than the pre-commitment-to-publish route, while still giving some useful tools for checking on things like publication bias, double publication, etc.

Comment author: Douglas_Knight 05 August 2009 05:02:40AM 1 point

> But just like many other obvious improvements we can all think of to the process of science, this one will not be carried out.

To a certain extent, it is being carried out for drug studies, but it requires centralization. At least, various central authorities have promised to require some pre-registration, though they may fail to follow through, as in the data-availability story. Individuals can do meta-analyses that are skeptical of the original publications, and they do, on special occasions.

I think I've heard the line about parapsychology as a joke in a number of places, but I heard it seriously from Vassar.

Comment author: Jonathan_Graehl 04 August 2009 10:52:57PM 0 points

> have journals approve study designs for publication in advance, including all statistical tools to be used; and then you do the study and run the preselected analysis and publish the results, regardless of whether positive or negative

Brilliant.

Maybe a notary service for such plans would become popular from the ground up. Of course, to get voluntary adoption, you'd have to implement a guarantee of secrecy for a desired time period (even though the interests of science would be best served by early publicity, those scientists want their priority).

Let's see, just the right protocol for signing/encrypting, and ... never mind, it will never be used until some high-status scientists want to show off ;)
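
For what it's worth, a minimal sketch of the secrecy-plus-priority part (the function names and the example plan below are invented for illustration, not a protocol anyone here specified): publish only a salted hash of the study plan now, reveal the plan and the salt after the embargo, and anyone can check that they match the earlier digest.

```python
import hashlib
import secrets

def commit(study_plan: str) -> tuple[str, str]:
    """Return (digest to publish now, nonce to keep private until reveal)."""
    nonce = secrets.token_hex(16)  # random salt so short plans can't be guessed from the digest
    digest = hashlib.sha256((nonce + study_plan).encode()).hexdigest()
    return digest, nonce

def verify(study_plan: str, nonce: str, digest: str) -> bool:
    """Check a revealed plan and nonce against the previously published digest."""
    return hashlib.sha256((nonce + study_plan).encode()).hexdigest() == digest

plan = "Two-arm RCT, 200 per arm, primary outcome X, two-sided t-test at alpha = 0.05"
digest, nonce = commit(plan)        # digest gets published (or notarized) today
print(verify(plan, nonce, digest))  # after the embargo: True iff the plan is unchanged
```

The timestamp on the published digest does the notarizing; the plan itself never has to leave the researcher's hands until they choose to reveal it.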

Comment author: CronoDAS 04 August 2009 09:22:08PM 1 point

> Parapsychology: The control group for science.

Excellent quote. May I steal it?

Comment author: Tyrrell_McAllister 04 August 2009 09:26:28PM 0 points

It's too good to ask permission for. I'll wait to get forgiveness ;).