In one study, replicators repeated the experiments from a 2011 Nature paper in which researchers used a drug to impede the growth of leukemia cells. In both the original and the replication, the inhibitor worked in one cell line but not another; unlike the original study, however, the replication attempt did not see increased survival in treated mice.
“Differences between the original study and this replication attempt, such as different conditioning regimens and doses [of the drug], are factors that might have influenced the outcome,” the replication team writes in its report.
Tony Kouzarides of the University of Cambridge, who led the original Nature study, tells ScienceInsider that the discrepancy between the two attempts “highlights the pitfalls of biological research, namely, that different labs may vary conditions that affect the outcome of a given experiment.”
The other Reproducibility Project study replicated experiments from a 2010 Cancer Cell study examining genetic mutations underlying acute myeloid leukemia. For the most part, the replicators arrived at similar results, and they note in their report that subsequent studies have also backed up the 2010 conclusions.
Other replication attempts from the Reproducibility Project have not always matched the results of the original papers. One published in January, for instance, did not find that an antibody therapy for breast cancer shrank tumors in mice as the original authors had observed. The replication team also encountered another problem: many of the tumors regressed spontaneously.
“This is an extremely important effort. Even though the published results pertain to only a small set of the larger project, the picture is convincing that reproducibility in cancer biology is very difficult to achieve,” Stanford University School of Medicine’s John Ioannidis, who was not involved with the project, told The Scientist in January. “I see the results not in a negative way . . . but as a reality check and as an opportunity to move in the right direction, which means more transparency, more openness, more detailed documentation of [methodology], and more honesty with ourselves.”