Science suffers from a reproducibility crisis.1 Across disciplines, meta researchers, scientists who study the science of science, reproduce less than half of published results in fields ranging from cancer biology to psychology.2,3 In response, meta researchers and basic research scientists are teaming up to tackle this problem. In Brazil, the Brazilian Reproducibility Initiative (BRI), a consortium established to evaluate data replicability, investigated biomedical research within the country. Their findings, published as a preprint on bioRxiv, highlight gaps in how research is reported and point to solutions that can be implemented at a local scale.4
The Case for Improved Research Reproducibility
Clarissa Carneiro, a meta scientist and currently the co-executive director at the Brazilian Reproducibility Network, experienced the reproducibility challenge firsthand. As an undergraduate pharmacy student at the Federal University of Rio de Janeiro (UFRJ), she tried to establish a behavioral model from a publication to study neurobiological aspects of memory, but “nothing worked,” she recalled.
“The first instinct is always to think that something’s wrong with me or with our institution,” she continued, adding that differences between the experimental animals and their housing infrastructure could cause variability. However, when she investigated further, she found papers from the pharmaceutical industry describing similar challenges in replicating academic findings.5,6
Although scientists know that poor reproducibility can delay drug development, Carneiro explained that the lack of good criteria for establishing the validity of methods left academic and industry scientists helpless. This, she said, makes it harder to determine where problems lie in failed drug development and, more generally, in replicating academic research. She added that it is currently difficult to evaluate whether a scientist got a negative result because a drug genuinely doesn’t work or because that researcher didn’t carry out the method the way the original group intended.
However, her interest in the question was at a more foundational level. “It’s about improving the methods that we use,” she said. “How we select those methods, how we report the methods, and the results. And it also has an impact on the communication between scientists [and] among scientists.”
At the time, she worked in neuroscientist Olavo Amaral’s group at UFRJ. This reproducibility issue had caught Amaral’s attention in 2011 when he taught statistics to graduate students and later during his neuroscience research. “I think we’ve had reproducibility problems for a while now,” he said. “I don’t know whether this has always been with us or it’s a more recent problem. I think we have started to pay more attention to it recently.”
In 2017, with the goal of addressing this issue, Amaral transitioned to the field of meta research and started BRI, which Carneiro and others in Amaral’s team joined. As they undertook their reproducibility project, the BRI coordinating team consisted of ten researchers and trainees from across Brazil with expertise in multiple disciplines.
Establishing a Process to Replicate Research
When the BRI team sketched out their one-shot project, the goal was to create a multicenter initiative to replicate experiments published by Brazilian researchers. This contrasted with previous reproducibility efforts, which evaluated methods across geographies but within a single research area. “Doing it locally is interesting in the sense that it allows dialog with our immediate community,” Amaral said. “A lot of the changes needed depend on local funders and local institutions promoting change.”

The Brazilian Reproducibility Initiative attempted to replicate research findings from across Brazil.
Credit: Olavo Amaral
The group started by identifying experimental methods frequently used across biomedical science research by Brazilian researchers. Then, they put out calls to labs across the country to participate in the project by replicating experiments that used these techniques. Ultimately, 56 groups contributed at least one replication experiment.
From a public database, the BRI coordinating team sourced more than 2,700 publications containing experiments that they would ask participating groups to replicate. Based on the groups that volunteered and their literature review of common research techniques, the coordinating team settled on three methods to test for reproducibility: a cell metabolism test, a gene amplification assay, and a behavior analysis. The team intended to assess each method across at least 20 experiments.
Carneiro transcribed each experiment’s methods section from its original paper into a working protocol. To fill gaps in the technical process, she asked the participating researchers for insights. However, before the replication studies could start, the COVID-19 pandemic shut down many labs for extended periods, during which some lab members left and new people took over the reproducibility work. “So, this other person now interpreted the [protocol] differently,” Carneiro said.
These instances were not unique; BRI found that experienced researchers often interpreted terminology in methods differently from their peers. Using an example of taking samples from tumor cell lines, Amaral explained, “If you have an n of three, what is three is not really clear. Is it like one cell culture that you will take different passages in different days and do the experiment, like in three completely independent instances? Or do you have three cultures, you subculture from each of them, and you do the whole screen in the same day? Or do you actually have one culture and you take little samples of this one culture and call this an n of three?”
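The distinction matters statistically, not just semantically: subsamples of a single culture share that culture’s batch-to-batch noise, so averaging them does not wash out biological variability the way independent cultures would. The sketch below (with hypothetical variance numbers chosen for illustration, not BRI data) simulates two of Amaral’s readings of “n of three” and compares how far each design’s mean tends to stray from the true value.

```python
import numpy as np

rng = np.random.default_rng(0)

TRUE_MEAN = 100.0           # hypothetical assay readout
BETWEEN_CULTURE_SD = 10.0   # assumed biological (between-culture) noise
WITHIN_CULTURE_SD = 2.0     # assumed technical (within-culture) noise

def mean_of_three_cultures():
    """'n of 3' read as three fully independent cultures."""
    cultures = TRUE_MEAN + rng.normal(0.0, BETWEEN_CULTURE_SD, size=3)
    samples = cultures + rng.normal(0.0, WITHIN_CULTURE_SD, size=3)
    return samples.mean()

def mean_of_three_subsamples():
    """'n of 3' read as three subsamples of one culture: the
    between-culture noise is shared, so it never averages out."""
    culture = TRUE_MEAN + rng.normal(0.0, BETWEEN_CULTURE_SD)
    samples = culture + rng.normal(0.0, WITHIN_CULTURE_SD, size=3)
    return samples.mean()

trials = 10_000
err_bio = np.mean([abs(mean_of_three_cultures() - TRUE_MEAN) for _ in range(trials)])
err_tech = np.mean([abs(mean_of_three_subsamples() - TRUE_MEAN) for _ in range(trials)])
print(f"typical error, three independent cultures:     {err_bio:.2f}")
print(f"typical error, three subsamples of one culture: {err_tech:.2f}")
```

Under these assumed noise levels, the subsample design’s “n of three” mean lands roughly twice as far from the truth, which is why the same sentence in a methods section can describe two experiments with very different evidential weight.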
This dilemma became more apparent after the groups completed the replication studies, in which they recorded their processes in as much detail as possible. Before analyzing the data, a panel of coordinating team members and participants from the replicating labs met to review the submitted protocols and notes and to decide whether any submissions should be excluded for protocol deviations. “We lost almost a third of replications by doing this,” Amaral said. In the end, the team analyzed 97 replications of 47 individual experiments.
After this evaluation step, the BRI team found that the reproducibility rate for the three methods ranged from 15 to 45 percent. Previous reproducibility studies yielded similar results.2,3
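The preprint details the statistical criteria BRI used to call a result replicated;4 as a rough illustration of how such a headline rate can be computed, the sketch below applies one widely used criterion, assumed here for the example and not necessarily the one BRI adopted: a replication counts as successful when its effect is statistically significant in the same direction as the original.

```python
import numpy as np
from scipy import stats

def replicated(treated, control, original_direction, alpha=0.05):
    """One common criterion (an assumption, not necessarily BRI's):
    the replication is significant and signed like the original effect."""
    t_stat, p_value = stats.ttest_ind(treated, control)
    return p_value < alpha and np.sign(t_stat) == original_direction

# Synthetic example: 20 replication attempts of experiments whose
# original effect was positive (original_direction=+1).
rng = np.random.default_rng(1)
successes = []
for _ in range(20):
    treated = rng.normal(1.0, 2.0, size=10)  # fabricated illustrative data
    control = rng.normal(0.0, 2.0, size=10)
    successes.append(replicated(treated, control, original_direction=+1))

print(f"replication rate: {np.mean(successes):.0%}")
```

Stricter criteria, such as requiring the replication estimate to fall within a prediction interval derived from the original study, generally yield lower rates, which is one reason headline reproducibility figures vary from study to study.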
Addressing Reproducibility in Research at Scale

Clarissa Carneiro explores reproducibility and how to improve it in Brazilian science as part of the Brazilian Reproducibility Network.
Credit: Diego Padilha/Instituto Serrapilheira
“This doesn’t mean that whatever is written there is wrong,” Amaral said. Instead, he said that it indicates there are gaps in how research is reported, such as details about laboratory environments and specific terminologies.
“The most interesting [thing] about the results is really the problems you faced along the way,” Carneiro said, highlighting the communication challenges that the teams experienced. Amaral and Carneiro agreed that their study points to areas where research communication could be improved overall.
“The challenge ahead is to figure out how we can do this more efficiently as [a] broader community,” Carneiro said. Next, the BRI team will explore specific challenges for each of the methods that the study assessed and discuss potential solutions to them.
Additionally, both Carneiro and Amaral continue their reproducibility work with the Brazilian Reproducibility Network, an initiative that invites researchers across the country to develop resources to promote research reproducibility.
1. Korbmacher M, et al. The replication crisis has led to positive structural, procedural, and community changes. Commun Psychol. 2023;1(1):3.
2. Errington TM, et al. Investigating the replicability of preclinical cancer biology. eLife. 2021;10:e71601.
3. Open Science Collaboration. Estimating the reproducibility of psychological science. Science. 2015;349(6251):aac4716.
4. Brazilian Reproducibility Initiative. Estimating the replicability of Brazilian biomedical science. bioRxiv. 2025. doi:10.1101/2025.04.02.645026.
5. Prinz F, et al. Believe it or not: How much can we rely on published data on potential drug targets? Nat Rev Drug Discov. 2011;10(9):712.
6. Begley CG, Ellis LM. Raise standards for preclinical cancer research. Nature. 2012;483(7391):531-533.