As the COVID-19 pandemic has unfolded, scientific experts and policymakers have had to work together closely to keep up with the latest science and implement policies to prevent further transmission of the virus. Meanwhile, the rapid pace of COVID-19 publishing has allowed misinformation to spread rampantly, sometimes working its way into the upper echelons of government.
To see how well policymakers worldwide incorporated solid COVID-19 research into their decisions, researchers examined scientific citations within 37,725 policy documents spanning 114 countries and 55 intergovernmental organizations that were drafted between early January and late May of last year. In a study published in Science yesterday (January 7), they found that these documents referenced COVID-19 papers that had, on average, 40 times more citations than other COVID-19 papers. Peer-reviewed papers from leading scientific journals were far more represented in policy documents than were preprints, the researchers found.
The Scientist spoke with Northwestern University economist Benjamin Jones, a coauthor of the new study, about why these results surprised him.
The Scientist: What was your motivation for conducting this study?
Benjamin Jones: First, conceptually, there’s a long-standing set of questions and concerns about the extent to which science is drawn upon in policy—does policy pay attention to science at all? But also, does it pay attention to good science? That goes back to C.P. Snow’s examination of World War II, the importance of science in the war effort, and who got listened to. [Editor’s note: Chemist and novelist C.P. Snow wrote The Two Cultures, an influential essay in which he lamented the cultural divide separating the arts and sciences.] So I think there’s a broad set of questions and concerns about how policy draws on science, and that’s been long studied.
The second channel is that Big Data has come to research on all sorts of questions. We have increasingly exploited these incredible datasets of every scientific publication. And more recently we’re seeing the attempt to see how science is being drawn upon in other kinds of domains. I’ve done some work on how science is used, for example, in patenting and we’re seeing work on how science is used in the media, but now we’re actually beginning to see better and better data opportunities for how science is referenced in policy.
We had a new dataset, Overton, that allowed us to come into that, and then of course we have COVID on the mind, so that’s a chance to really see this interaction in real time.
TS: So you had this giant database with tens of thousands of papers. How did you analyze them?
BJ: In some sense, we’re doing something very basic—the very first thing you would do—which is we see this explosion of work in science about COVID, and then we have the data with pretty much real-time policy documents that are being produced. It was a chance to see if all the policy that has been developed around COVID is in fact aware of and referencing all of this science related to COVID. We’re basically merging this giant global database of policy documents [with] this very large database of evolving science. We can both see if it’s linked, and then we can see, is it linked to the good science or less-vetted science, and we can see who’s doing the linking—which institutions in the policy domain seem to be important for bridging into that science and bringing those ideas and insights into policy.
TS: We’ve all seen misinformation spread very quickly during the pandemic. Did you find that less-rigorous science impacted policymakers?
BJ: The classic idea is that [policymakers] are just a very different community of people with different expertise. Does the average policymaker understand physics, chemistry, or infectious disease biology? Probably not, so how do they even know what to cite and adjudicate? And in the current social media context or certain political contexts [where] there’s misinformation, one’s even more concerned that people aren’t even trying to access the right knowledge, they’re trying to select whatever makes their point as opposed to thinking about what good science is saying.
We were quite positively surprised that the number one finding is both that all of these new policy documents are able to interface with the new science—they’re not ignoring it; they’re drawing on it—and more importantly, they seem to be drawing on the high impact science—the stuff that scientists themselves recognize as important, peer-reviewed, and appearing in the vetted places.
TS: What was your most surprising finding?
BJ: Beyond the fact that there’s this strong link with quality, which is comforting, I was struck by how important the World Health Organization appears in the picture. What we saw is that different national governments have different tendencies to draw on science directly. But in general, governments draw on it less, and these international intergovernmental organizations like the WHO draw on it correspondingly far more. In fact, a lot of the way that the science is getting into policy documents at the national level is indirect, so it’s basically the World Health Organization, say, will write something, drawing out all this latest science, and then other governments will cite this WHO policy document as opposed to citing the science directly.
That’s interesting because obviously we see in the pandemic different countries have taken very different approaches, and we see that in the data—there’s enormous heterogeneity in the use of science, so not everyone’s listening. The good stuff is getting into policy, but that doesn’t mean all policymakers or governments are listening. We see that the places like the WHO, which of course has been pilloried by some political actors in this pandemic, clearly play a very central role. I mean they really are the center of that whole network drawing science into policy.
TS: Preprint servers have played a big role in disseminating new scientific findings quickly during the pandemic. How were those studies used in policy?
BJ: We’ve seen some [preprints] that have gotten a lot of attention in the media and then turned out maybe not to be right. One thing we wondered within that question of what kind of science gets drawn on was whether the policy community would do the same thing and dive in on these brand new papers that weren’t vetted. That might not be a bad thing because speed matters in the pandemic.
But what we really see very strongly is that of all the COVID stuff going up on preprint servers, that stuff has a far, far, far lower chance of being drawn on. The policy documents really strongly emphasize the peer-reviewed work that’s been accepted. That’s interesting—I think it suggests that journals are continuing to play a very important role in helping policymakers select what work to draw upon.
TS: Looking forward to future potential public health emergencies, what lessons do you think we can learn from the data that you’ve analyzed?
BJ: The roadblock is not that policymakers just don’t know about science necessarily or can’t know about it. The roadblock is not that they’re unable to figure out what the right science is—when they cite it, they cite the good stuff. The roadblock to the effective use of science in policy, therefore, seems to depend a lot on whether people are really listening in different governments and different parts of the government. It suggests that there is an idiosyncratic nation-specific roadblock to the use of science in policy. And that of course comports with what we’re seeing and have seen throughout this pandemic.
Y. Yin et al., “Coevolution of policy and science during the pandemic,” Science, doi:10.1126/science.abe3084, 2021.
Editor’s note: The interview was edited for brevity and clarity.