Brazilian Scientists Evaluate Reproducibility in Biomedical Research

The Brazilian Reproducibility Initiative team could replicate less than half of regional research findings. How can scientists use these data to improve research replicability?

Written by Shelby Bradford, PhD | 6 min read

Science suffers from a reproducibility crisis.1 Across disciplines, meta researchers—scientists who study the science of science—can reproduce less than half of the results from biological science research.2,3 In response, meta researchers and basic research scientists are teaming up to tackle this problem. In Brazil, the Brazilian Reproducibility Initiative (BRI), a consortium established to evaluate data replicability, investigated biomedical research within the country. Its findings, published as a preprint on bioRxiv, highlight gaps in how research is conducted and reported, as well as solutions that can be implemented at a local scale.4

The Case for Improved Research Reproducibility

Clarissa Carneiro, a meta scientist and currently the co-executive director at the Brazilian Reproducibility Network, experienced the reproducibility challenge firsthand. As an undergraduate pharmacy student at the Federal University of Rio de Janeiro (UFRJ), she tried to establish a behavioral model from a publication to study neurobiological aspects of memory, but “nothing worked,” she recalled.

“The first instinct is always to think that something’s wrong with me or with our institution,” she continued, adding that differences between the experimental animals and their housing infrastructure could cause variability. However, when she investigated further, she found papers from the pharmaceutical industry describing similar challenges in replicating academic findings.5,6

Although scientists know that poor reproducibility can delay drug development, Carneiro explained that the lack of good criteria for establishing the validity of methods leaves academic and industry scientists helpless. This, she said, makes it harder to pinpoint where problems lie in failed drug development and, more generally, in replicating academic research. She added that it is currently difficult to evaluate whether a scientist obtained a negative result because a drug truly does not work or because the researcher did not perform the method the way the original group intended.


However, her interest in the question was at a more foundational level. “It’s about improving the methods that we use,” she said. “How we select those methods, how we report the methods, and the results. And it also has an impact on the communication between scientists [and] among scientists.”

At the time, she worked in neuroscientist Olavo Amaral’s group at UFRJ. The reproducibility issue had first caught Amaral’s attention in 2011, when he taught statistics to graduate students, and again later during his own neuroscience research. “I think we’ve had reproducibility problems for a while now,” he said. “I don’t know whether this has always been with us or it’s a more recent problem. I think we have started to pay more attention to it recently.”

In 2017, with the goal of addressing this issue, Amaral transitioned to the field of meta research and started BRI, which Carneiro and others in Amaral’s team joined. As they undertook their reproducibility project, the BRI coordinating team consisted of ten researchers and trainees from across Brazil with expertise in multiple disciplines.

Establishing a Process to Replicate Research

When the BRI team sketched out their project, their goal was to create a multicenter initiative to replicate experiments published by Brazilian researchers. This contrasted with previous reproducibility efforts, which evaluated methods across geographies but within a single research area. “Doing it locally is interesting in the sense that it allows dialog with our immediate community,” Amaral said. “A lot of the changes needed depend on local funders and local institutions promoting change.”

Members of the Brazilian Reproducibility Initiative, which attempted to replicate research findings from across Brazil. Olavo Amaral, today a meta researcher at the Federal University of Rio de Janeiro, stands on the far right; Clarissa Carneiro, today the co-executive director of the Brazilian Reproducibility Network, stands second from the left.

Credit: Olavo Amaral

The group started by identifying experimental methods frequently used across biomedical science research by Brazilian researchers. Then, they put out calls to labs across the country to participate in the project by replicating experiments that used these techniques. Ultimately, 56 groups contributed at least one replication experiment.

From a public database, the BRI coordinating team sourced more than 2,700 publications containing experiments that they could ask participating groups to replicate. Based on the groups that volunteered and their review of the literature for common research techniques, the coordinating team settled on three methods to test for reproducibility: a cell metabolism test, a gene amplification assay, and a behavioral analysis. The team intended to assess each method across at least 20 experiments.

Carneiro transcribed each experiment’s method section from its original paper into a working protocol, asking the participating researchers for insights to fill gaps in the technical process. However, before the replication studies could start, the COVID-19 pandemic shut down many labs for extended periods, during which some lab members left and new people took over the replication work. “So, this other person now interpreted the [protocol] differently,” Carneiro said.

These instances were not unique; the BRI found that experienced researchers often interpreted the terminology in methods sections differently from their peers. Using the example of sampling tumor cell lines, Amaral explained, “If you have an n of three, what is three is not really clear. Is it like one cell culture that you will take different passages in different days and do the experiment, like in three completely independent instances? Or do you have three cultures, you subculture from each of them, and you do the whole screen in the same day? Or do you actually have one culture and you take little samples of this one culture and call this an n of three?”

This dilemma became more apparent after the groups completed the replication studies, in which they recorded their processes in as much detail as possible. Before analyzing the data, coordinating team members and participants from the replicating labs met to review the protocols and notes that the replicating teams had submitted and to decide whether any submissions should be excluded because of protocol deviations. “We lost almost a third of replications by doing this,” Amaral said. In the end, the team analyzed 97 replications of 47 individual experiments.

After this evaluation step, the BRI team found that the replication rate across the three methods fell between 15 and 45 percent. Previous reproducibility studies have yielded similar results.

Addressing Reproducibility in Research at Scale

Clarissa Carneiro, co-executive director at the Brazilian Reproducibility Network, explores reproducibility and how to improve it in Brazilian science.

Credit: Diego Padilha/Instituto Serrapilheira

“This doesn’t mean that whatever is written there is wrong,” Amaral said. Instead, he said that it indicates there are gaps in how research is reported, such as details about laboratory environments and specific terminologies.

“The most interesting [thing] about the results is really the problems you faced along the way,” Carneiro said, highlighting the challenges in communication that they experienced. Amaral and Carneiro agreed that their study offers areas where research communication could be improved overall.

“The challenge ahead is to figure out how we can do this more efficiently as [a] broader community,” Carneiro said. Next, the BRI team will explore the specific challenges for each of the methods that the study assessed and discuss potential solutions to them.

Additionally, both Carneiro and Amaral continue their reproducibility work with the Brazilian Reproducibility Network, an initiative that invites researchers across the country to develop resources to promote research reproducibility.

  1. Korbmacher M, et al. The replication crisis has led to positive structural, procedural, and community changes. Commun Psychol. 2023;1(1):3.
  2. Errington TM, et al. Investigating the replicability of preclinical cancer biology. eLife. 2021;10:e71601.
  3. Open Science Collaboration. Estimating the reproducibility of psychological science. Science. 2015;349(6251):eaac4716.
  4. Brazilian Reproducibility Initiative. Estimating the replicability of Brazilian biomedical science. bioRxiv. 2025.04.02.645026.
  5. Prinz F, et al. Believe it or not: How much can we rely on published data on potential drug targets? Nat Rev Drug Discov. 2011;10(9):712.
  6. Begley CG, Ellis LM. Raise standards for preclinical cancer research. Nature. 2012;483(7391):531-533.

Meet the Author

  • Shelby Bradford, PhD

    Shelby is an Assistant Editor at The Scientist. She earned her PhD in immunology and microbial pathogenesis from West Virginia University, where she studied neonatal responses to vaccination. She completed an AAAS Mass Media Fellowship at StateImpact Pennsylvania, and her writing has also appeared in Massive Science. Shelby participated in the 2023 flagship ComSciCon and volunteered with science outreach programs and Carnegie Science Center during graduate school. 
