Despite a push for transparency in science, full data disclosure may be close to non-existent among published studies. Of 441 randomly selected biomedical research papers analyzed in a new study, none provided access to all the authors’ data. And only one of these papers shared a complete protocol. The results of this analysis, which could shed light on science’s reproducibility problem, were published today (January 4) in PLOS Biology.
“What was most surprising to me was the complete lack of data-sharing and protocol availability,” said study coauthor John Ioannidis, a professor of medicine and health research and policy at the Stanford University School of Medicine. “That was worse than I would have predicted.”
“This study confirms what most of us already know—that the current clinical research enterprise is set up in a way that researchers consider data their own assets,” said cardiologist Harlan Krumholz of Yale University, who was not involved in the work.
“It is commendable of the authors to put this study together,” said bioethicist Arthur Caplan of New York University, who also was not part of the work. “It’s unique and a useful baseline of transparency, conflict of interest, and reproducibility [in the biomedical literature],” he added.
Ioannidis’s team examined a random selection of papers published between 2000 and 2014 and deposited into PubMed Central. Among the 441 PubMed-indexed papers analyzed, 268 included empirical data. Of these 268 (including 15 papers reporting clinical trial results), only one disclosed a complete study protocol; that paper reported a clinical trial. None of these 268 papers provided a means to access their full datasets; only one mentioned making complete raw data available upon request.
For Doug Altman, a professor of statistics in medicine at Oxford University, U.K., who was not involved in the work, the study’s design makes the findings difficult to interpret. “The sample covers such a long period that I’m not convinced it gives a good picture of current practices, as the authors claim,” Altman wrote in an email to The Scientist. “And the heterogeneous sample probably disguises major variations across types of research and possibly also across specialties.”
Funding disclosure was spotty at best within the study sample. Half (51.7 percent) of the 441 publications did not divulge any information on funding sources, the researchers found. “Having that many researchers having nothing to say about funding is not consistent with the current patterns of funding. About 55 percent of biomedical research is now privately, industry-funded,” Caplan told The Scientist. “Disclosure of funding is 100 percent on the journals—that should be a condition of publication.”
More than half of the 441 publications (69.2 percent) did not include conflict of interest statements. Still, the percentage of papers that included conflict of interest statements increased from 2000 to 2014, Ioannidis noted, indicating that more journal editors have adopted explicit policies for authors’ disclosures.
Of the 268 studies containing empirical data, just four (1.5 percent) were replication studies; eight articles (3.1 percent) noted that the results had been replicated in subsequent papers. “This is not surprising because most journals don’t publish replication studies unless they are of monumental significance,” said Caplan.
“We have continuing evidence that when different people view the same data, they can come to different conclusions,” said Krumholz. “It’s as if there is just one telescope and one person looking through it who tells everyone what he sees. But it’s plausible—even likely—that when others look in the telescope with the same data, they will summarize it in a different way. So there is a real importance of allowing others to see what you have done.”
Authors alone aren’t to blame for the observed lack of transparency among published papers, said Caplan. “It’s journal editors, funders, technology transfer officers at schools, and authors,” he said. “There are a lot of players in the world of transparency and disclosure.”
“What jumped out here is the uniquely broad sampling across a range of journals [that] revealed . . . extremely low data-sharing,” said Brian Nosek, a professor of psychology at the University of Virginia and executive director of the Center for Open Science who was not part of the new study. “That shows how deep this problem is. All the talk about data sharing and improving reproducibility . . . has not yet translated into pervasive action across the journals.”
Krumholz believes the issues can be resolved. “There is a growing momentum and appreciation for the importance of open science and recognition that our scientific enterprise may not be serving the public interest by sequestering data,” he said. “We now need to find solutions, tools and structures and funding.”
Meantime, Ioannidis said he and his colleagues will continue to track the literature for progress to “see how big of a change we can achieve over time from this low starting point.”
S.A. Iqbal et al., “Reproducible research practices and transparency across the biomedical literature,” PLOS Biology, doi:10.1371/journal.pbio.1002333, 2015.