A large-scale project designed to assess reproducibility in preclinical cancer research has identified significant challenges in repeating other scientists’ work, renewing calls for increased transparency and data-sharing in the biomedical community.
The project, launched in 2013 by the nonprofit Center for Open Science (COS) in collaboration with the online research marketplace Science Exchange, attempted to reproduce key results from more than 50 high-impact studies published between 2010 and 2012.
Over the next eight years, the researchers managed to repeat experiments from a little under half of those studies, and found that the results they obtained were typically far less clear-cut than the ones reported in the original papers—an assessment that has drawn criticism from some of those papers’ authors.
For the remainder of the studies, the team often wasn’t able to obtain enough information about the methods used from either the papers or their authors, and had to abandon those replication attempts.