In 2016, a Nature survey of 1,576 researchers revealed that more than 70 percent of them had tried and failed to reproduce another scientist’s experiments—and more than half failed to replicate their own. These and other recent findings on the lack of reproducibility in scientific research have inspired the creation of groups such as the UK Reproducibility Network (UKRN).
Launched in March 2019, the UKRN is an interdisciplinary consortium that aims to tackle this issue in order to bolster research quality. Last month, 10 UK universities became part of the UKRN, joining a network that already includes stakeholders such as the Academy of Medical Sciences, Research Libraries UK, the National Institute for Biological Standards and Control, journals including Nature and PLOS, and local networks of researchers, reports Times Higher Education.
The Scientist spoke with Marcus Munafò, a biological psychologist at the University of Bristol and the chair of the UKRN’s steering group of researchers, about UKRN’s structure, activities, and future plans.
TS: There’s been a lot of talk about the reproducibility crisis over the past few years. Could you give our readers some background about what led to the creation of UKRN?
Marcus Munafò: I’m not sure I particularly like the crisis narrative. There’s been a lot of interest in whether or not the research that [people] do is as robust and replicable as it could be, and it’s healthy to reflect on whether or not we could do better. I think any enterprise should have some proportion of its effort invested in thinking about whether or not it can improve the way in which it works. So it’s much better to frame the conversation in those terms.
We’re now at the point of thinking, what could we change about the ways in which we work that might improve the quality of what we do? How can we conduct research into whether or not, for example, data sharing improves the efficiency of knowledge transfer, or improves quality, or whatever it might be? When you start thinking in that way, you realize that you need to engage with a range of different component parts of the research system, because it is a very complex, interconnected global system. The scientific research effort includes academia, industry, funders, publishers, institutions, and individual researchers, and what we try to do with UKRN is to bring together those different elements of the research system.
TS: How would you describe the UKRN’s structure?
MM: Our structure broadly has three layers to it. We have grassroots local networks at individual institutions that are predominantly early- and mid-career researcher-led; they’re very informal and self-organizing, and they run reproducibility journal clubs and open-research working groups. At the top, we have a stakeholder group: funders, publishers, and other sectoral organizations who are interested in ensuring that what they publish or fund is high quality. And the third part of the structure, in the middle, is the institutions themselves.
UKRN is a very informal organization in many ways. . . . We’re not about telling people what to do. We’re about trying to coordinate [and] broker conversations, harmonize as much as possible, and really focus on integration and coordination, both vertically and horizontally.
TS: How are researchers investigating reproducibility?
MM: There’s a growing subdiscipline of meta-research, or research on research, using scientific tools to understand how the research process itself works. It’s not a new thing, but it’s grown in prominence over the last ten or fifteen years, particularly since John Ioannidis’s 2005 paper in PLOS Medicine, “Why Most Published Research Findings Are False,” which stimulated interest in this idea of using the scientific method to interrogate research practices themselves. There are now hundreds and hundreds of papers in that space.
Now, we’re starting to see [research] looking at some of the innovations introduced. . . . Lots of journals are offering a registered report format now, which is one of the initiatives that we support—this idea that you submit your work to a journal before you’ve collected any data, and you’re offered in-principle acceptance on the basis of the importance of the research question and the robustness of the methodology, not on whether or not the results are exciting. There have been some studies looking at the difference between registered reports and conventional article formats . . . . It’s an attempt to evaluate what the impact of working in these new ways might be on the quality of research that we produce. [Editor’s note: this study in PLOS Biology is an example.]
But, of course, one of the things we also need to be doing in the background is thinking about the wider research culture, because any one of these initiatives on its own is not going to have a huge impact. What we need is a portfolio of different initiatives that are coordinated and built into a wider discussion around how we actually update our research culture.
TS: What kind of training is the UKRN involved with?
MM: I’m at a workshop at the moment that we hold in Cumberland Lodge in Windsor Great Park, with 30 early-career researchers, typically PhD students and postdocs, being trained in things like reproducible workflows and data curation, [which] allow people to go back to their data from six months, a year, or two years ago and very rapidly recapitulate the results that they generated.
Many of the skills that we need to train people in are things that are not necessarily even mainly part of academic training. Academics tend to get trained in the core skills of doing research as part of their PhD, but they often have relatively little training on things like project management and leadership. So we span quite a broad range of elements ranging from specific technical skills to broader skills that are more transferable.
TS: What future plans are in store?
MM: One of the things we need to do is to manage our growth, because we’ve actually moved incredibly fast. For example, getting universities to sign up formally to the UKRN was one of our year three goals, and we already achieved it in year one. So things are moving a lot faster than we expected, which is exciting. But . . . we’re still active researchers, so we have to manage that growth quite carefully. Ultimately, what we want to do is to get more universities to formally sign up within the UK.
The other thing that’s exciting is that we’ve had interest from other countries about the possibility of setting up a similarly structured network. . . . There aren’t many structures like ours that include these different levels and focus on the broader perspective, and I think that linkage between the funders, publishers, institutions, and researchers themselves is potentially quite powerful. If we can get similarly structured networks set up in other countries, then we’ll start to have a framework that would really allow us to tackle this problem at the global level.
One of the guiding principles of how we work is that we try to be transparent. We’re very happy to share all of our material. If anyone’s interested in setting up a similarly structured network in their country, then of course, we’d be happy to support that. It’s been interesting, the different approaches that we’ve [seen]. In some countries, it’s been the researchers themselves trying to create that structure from the ground up. In other countries, it’s been more led by funders who are interested in creating a similar structure from the top down. So there are different ways of doing it. Every country, every discipline, every university, every department has its own distinct culture, so the solutions that we come up with need to be flexible enough to adapt to local needs and demands.
Emily Makowski is an intern at The Scientist. Email her at firstname.lastname@example.org.
Editor’s note: the interview was edited for brevity.