Less Wrong is a community blog devoted to refining the art of human rationality.

The Journal of (Failed) Replication Studies

Post author: Vladimir_Gritsenko 23 August 2009 09:15AM

One of Seed Magazine's "Revolutionary Minds" is Moshe Pritsker, who created the Journal of Visualized Experiments, which to me looks like a very cool idea. I imagine that early on it may have looked somewhat silly ("he can't implant engineered tissue in a rat heart and he calls himself a scientist?!"), so it's nice to know JoVE is picking up pace.

Many folks keep pointing out how published research is itself biased towards positive results, and how replication (and failed replication!) trumps mere "first!!!11" publication. If regular journals don't have good incentives to publish "mere" (failed) replication studies, why not create a journal that would be dedicated entirely to them? I can't speak to the logistics, but I imagine it could be anything from a start-up (a la JoVE) to an open repository (a la arxiv.org).

I am not part of academia, but I understand that there are a few folks here who are. What do you say?

[EDIT: Andrew Kemendo notes two such journals in the comments: http://www.jnrbm.com/ and http://www.jnr-eeb.org/index.php/jnr.]

Comments (14)

Comment author: AndrewKemendo 23 August 2009 12:18:57PM 5 points

If regular journals don't have good incentives to publish "mere" (failed) replication studies, why not create a journal that would be dedicated entirely to them?

There are a few Journals of negative results already out there:

http://www.jnrbm.com/

http://www.jnr-eeb.org/index.php/jnr

Comment author: Vladimir_Gritsenko 23 August 2009 02:34:35PM 1 point

Cool, thanks! (Also, Google-fu fail on my part.)

One other journal I just found (although no publications there yet): http://www.arjournals.com/ojs/

If this is representative, then it's both encouraging (at least a few folks are taking the problem seriously) and discouraging (they're too few). At least now there's something concrete to evangelize :-)

Comment author: cousin_it 23 August 2009 01:10:49PM 0 points

A novel negative result isn't the same as failing to replicate a study published by someone else.

Comment author: Vladimir_Gritsenko 23 August 2009 02:37:21PM 1 point

At least in the second journal (of ecology and evolutionary biology), they do say they accept replication studies.

Comment author: Nick_Tarleton 23 August 2009 04:28:47PM 3 points

Many folks keep pointing out how published research is itself biased towards positive results, and how replication (and failed replication!) trumps mere "first!!!11" publication.

Clarity check: "trumps" = "is (normatively) more important than"?

Also,

("he can't implant engineered tissue in a rat heart and he calls himself a scientist?!")

will be really confusing if/when that entry drops off the front page.

Comment author: Vladimir_Gritsenko 24 August 2009 10:50:29AM 0 points

Clarity check: "trumps" = "is (normatively) more important than"?

Yes.

will be really confusing if/when that entry drops off the front page.

Hehe :-) If you propose a less confusing quip, I'll edit it in.

Comment author: Christian_Szegedy 24 August 2009 02:49:22AM 2 points

The current system of scientific publishing is clearly outdated.

There are issues with the biases, but also with the for-profit nature of the system, which charges huge sums for access to work done by researchers on public grants and peer-reviewed by other researchers for free, all financed by public money.

Then add to this all those phony journals that are nominally peer-reviewed but have very low (or nonexistent) standards: they accept everything in order to make money, or exist only so that some big-pharma company can publish skewed trials to get FDA approvals, etc.

I think one or two additional special-purpose journals would not really change the landscape.

IMO, what we need is a complete modern infrastructure based on state-of-the-art IT and social networking: one that allows articles to be reviewed even after they have officially appeared, with an elaborate voting system that factors in the credibility of the reviewers. It should make it possible to publish refutations and positive or negative replication attempts, and to organize articles together with their supports, refutations, and endorsements in an easily accessible database.
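A minimal sketch of the kind of credibility-weighted voting such a system might use (the function, its weighting scheme, and all names here are hypothetical illustrations, not a description of any existing service):

```python
# Hypothetical sketch: weight each reviewer's vote on an article by that
# reviewer's credibility, so endorsements and refutations from trusted
# reviewers count for more than drive-by votes.

def weighted_score(votes):
    """votes: list of (vote, credibility) pairs, where vote is +1 (endorse)
    or -1 (refute) and credibility is a weight in [0, 1].
    Returns a normalized score in [-1, 1]; 0.0 if no credible votes exist."""
    total_weight = sum(cred for _, cred in votes)
    if total_weight == 0:
        return 0.0
    return sum(vote * cred for vote, cred in votes) / total_weight

# Example: two high-credibility endorsements outweigh one
# low-credibility refutation.
votes = [(+1, 0.9), (+1, 0.8), (-1, 0.2)]
print(round(weighted_score(votes), 3))  # (0.9 + 0.8 - 0.2) / 1.9 ≈ 0.789
```

In a real system the credibility weights would themselves be derived from the reviewer's track record, which is where most of the design difficulty lies.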

Ideally, such a system could work both as a rating and a publication medium, but given the current scientific publishing lobby it would not have much chance to take off. The better chance is to extend an existing meta-system (e.g. CiteSeer) with a general discussion/rating/publishing forum that allows critiques, refutations, and extensions of existing papers to be published, perhaps even in a peer-reviewed manner.

In the field I work in, the scientific community discusses and generally supports such changes, and given all the efforts and progress of the last decade I'd be surprised if one or more such (or similar) systems did not emerge in the next 10 years.

Comment author: Vladimir_Gritsenko 24 August 2009 10:17:15AM 0 points

Yes, that would be better, but as you yourself note, it's a big change that's unlikely to happen in one go. On the other hand, specialized journals are not a novelty, and considering that at least some folks have taken up this specific specialization, it appears to be more an issue of advertising than invention.

But nobody said this problem should be attacked on just one front. More (different) attempts mean more chances of success, no?

Comment author: Christian_Szegedy 24 August 2009 06:38:38PM 0 points

The coolest thing about the visualized experiment journal is that it exploits current computer technology to extend the scope of what a scientific publication means: it provides a new channel to communicate ideas at higher bandwidth, using the new but cheaply and generally available infrastructure of the net.

I agree that starting a journal like you mentioned can't do any harm.

Still, I think that for the specific purpose you have in mind (replication studies, critiques, follow-ups) a technologically more advanced solution would be essential. The reason is that most of these studies would be annotations on existing publications, so an easily accessible database structure would make scientific discourse much more fluid and transparent. Checking articles for replicated results and criticisms would become much easier, pushing authors to higher standards and exposing fake research and journals.

The necessary technology for that does not require much 21st-century stuff. A system simpler than the IMDb of the '90s, combined with some off-the-shelf social networking framework, would easily do the trick. Since there are a lot of existing journal databases, I am pretty sure we are going to see several alternative solutions emerge in the next few years for this exact purpose. In fact, we can already see that to some extent.

I would also see some value in combining a traditional peer-reviewed journal structure with such a system, to boost the credibility of both the system and the journal.

My general opinion is that scientific publishing (more so than popular literature or newspapers) is on the brink of a huge paradigm shift. Just entering the field with old-fashioned stuff that does not look forward technologically is a dead end, IMO.

Comment author: cousin_it 24 August 2009 09:48:45AM 0 points

I think one or two additional special-purpose journals would not really change the landscape. IMO, what we need is a complete modern infrastructure based on state of the art IT/social networking.

You may be right, but beware of throwing cold water on Vladimir's idea. It might just work. After all, arXiv is a simple website without bells and whistles, and look how much impact it has had.

Comment author: PhilGoetz 25 August 2009 02:57:49AM 1 point

I think you're focusing on the wrong part of the right problem. Studies that attempt to replicate someone else's work but produce negative results are much more common than studies that produce negative results using a method that hasn't been published before.

Comment author: MichaelBishop 23 August 2009 10:49:48PM 0 points

I think the bigger problem is not that people are unable to publish failed replications, but that people don't even try to replicate because there is little prestige or funding for it. I think we need greater rewards for successful and unsuccessful replications, and in the latter case, greater negative consequences for the people who did the original work.

Having a journal for replications (why exclude successful replications?) might help, but in my opinion something more dramatic will be needed.

If you want food for thought, see Robin's paper

Comment author: Jonathan_Graehl 23 August 2009 11:03:25PM 4 points

This just came up on HN: How To Publish a Scientific Comment in 123 Easy Steps (pdf)

Ideally, an author can be notified of their mistake and publish a retraction (it should be their responsibility), but it seems that some authors would rather defend work they know is flawed, or do nothing and hope their mistake goes unnoticed.

Comment author: Kaj_Sotala 28 August 2009 10:35:26AM 0 points

Thank you for the link. I can't remember the last time I laughed this much.