In 2009, Science published a paper linking chronic fatigue syndrome with the mouse virus XMRV, prompting a flurry of subsequent studies, none of which could replicate the findings. The paper was retracted last year. Then, in 2010, Science published a paper describing a strain of bacteria that incorporated arsenic instead of phosphorus into its DNA backbone, only to publish two studies refuting the findings this July. In this case, the journal has not asked the authors for a correction or retraction, citing the self-correcting nature of the scientific process.
And these high-profile examples are by no means isolated incidents. In 2011, scientists at Bayer Healthcare in Germany recounted their dismal experience trying to validate published research on new drug targets: in more than 75 percent of the 67 studies they attempted, Bayer’s labs could not replicate the published findings. This past March, researchers at Amgen reported a similar experience: they were able to confirm the results of only six of 53 landmark cancer studies they attempted to reproduce.
Indeed, published studies whose findings cannot be reproduced appear to be on the rise, and while some such studies are later retracted, many stand, collecting citations, either because no one has tried to replicate them or because those who have, successfully or not, cannot get their replication attempts published. A new partnership between the start-up Science Exchange, an online marketplace for outsourcing experiments, and the open-access journal PLoS ONE hopes to address the issue of scientific reproducibility. Announced yesterday (August 14), the Reproducibility Initiative provides a platform for researchers to volunteer their studies for replication by independent third parties. Studies validated through the initiative will earn a certificate of reproducibility, similar to a Consumer Reports recommendation for a particular car model.
“We think that, long term, there will ultimately be a shift from rewarding highly unexpected results to rewarding reproducible, high-quality results that are really true,” said Elizabeth Iorns, a former breast cancer researcher and CEO of Science Exchange. Whether or not the new incentive system will have a broad impact on the scientific community, however, remains up for debate.
The Reproducibility Initiative takes advantage of Science Exchange’s existing network of more than 1,000 core facilities and commercial research organizations. Researchers submit their studies to the initiative, which then matches them with qualified facilities that will attempt to replicate the work for a fee. The pilot program is accepting 40–50 studies, with preference given to preclinical studies that have translational value. Submitting researchers will have to pay for the replication studies, which Iorns estimates might run about one-tenth the cost of the original study, as well as a 5 percent transaction fee to Science Exchange. Participants will remain anonymous unless they choose to publish the replication results in a PLoS ONE Special Collection later this year, which will include overall statistics on the rate of replication.
“We can’t oblige anyone to publish anything,” said Damian Pattinson, the executive editor of PLoS ONE, though he stressed, “If you can’t reproduce the study, it’s very important that people know that.”
The new initiative, he added, is in line with the journal’s record of publishing studies replicating previous findings or presenting negative data—the kind of research often ignored by prominent journals. The current incentive structure for scientific research “pays limited attention to replication and more attention to innovation and extravagant claims,” agreed John Ioannidis, a professor of medicine, health research and policy, and statistics at Stanford University and a scientific advisor to the Reproducibility Initiative. As a result, researchers are pressured to pursue novel areas of research, undermining the self-correcting nature of science cited by some as the solution to the irreproducibility problem.
Even researchers with no intention of committing fraud or misconduct are pressured to cut corners or succumb to bias. In 2005, Ioannidis published a widely read paper in PLoS Medicine suggesting that most published scientific findings are actually false. According to Ioannidis, many false positives stem from researchers hunting for statistically significant results with little regard for the prior likelihood of the relationship being tested. “As long as journals and reviewers are seeking to publish the ‘perfect story,’ investigators are almost subconsciously persuaded to select their best data for manuscript submission,” said Lee Ellis, a cancer researcher at the MD Anderson Cancer Center and a member of the Reproducibility Initiative’s scientific advisory board.
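Ioannidis’s argument is, at bottom, base-rate arithmetic: when only a small fraction of the hypotheses a field tests are actually true, false positives can outnumber true positives even at conventional significance thresholds. The short Python sketch below works through one such scenario; the prior, power, and significance figures are illustrative assumptions chosen for the arithmetic, not numbers taken from his paper.

```python
# Back-of-the-envelope arithmetic behind Ioannidis's argument.
# All three figures below are illustrative assumptions.
prior = 0.10   # fraction of tested hypotheses that are actually true
power = 0.80   # chance a real effect yields a statistically significant result
alpha = 0.05   # conventional false-positive rate for a single test

true_positives = prior * power          # 0.08 of all tests
false_positives = (1 - prior) * alpha   # 0.045 of all tests

# Of the findings that come up "significant," the share that are real:
ppv = true_positives / (true_positives + false_positives)
print(f"Share of significant findings that are true: {ppv:.2f}")  # ~0.64
```

Even in this generous scenario, roughly one in three significant findings is false; lower the prior to a few percent, or fold in the selective reporting Ellis describes, and false findings become the majority, which is the sense in which most published findings can be false.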
Researchers say they are eager to see how the Reproducibility Initiative, which appears to be the first of its kind, plays out, and whether it will begin to make a dent in the rising number of irreproducible results in the published literature. With its self-selecting participants, the Reproducibility Initiative is unlikely to uncover scientific misconduct, but it may offer incentives for more rigorous research practices. “It will be very interesting to see what kind of teams would be interested in submitting their studies for replication,” said Ioannidis.
Iorns suspects that researchers trying to license their discoveries to industry might find a certificate of reproducibility particularly useful. In the long term, added Ellis, perhaps high-impact journals will ask for proof of reproducibility prior to publication. “If studies are required to be validated, then perhaps the highest impact journals will more likely publish articles that are likely to impact the lives of our patients,” he said.
“In theory it’s a good idea,” agreed Arturo Casadevall, a microbiologist and immunologist at the Albert Einstein College of Medicine, who is not involved with the initiative. That said, he could not envision his own lab participating in the Reproducibility Initiative. “I just don’t see your average scientist running a lab on a very tight budget having the money to have experiments done elsewhere.” Still, he added, “anything out there that tries to improve scientific integrity has to be looked on positively.”