In September 2011, Ron Fouchier of Erasmus Medical Center in Rotterdam, the Netherlands, presented some shocking results at the annual conference of the European Scientific Working Group on Influenza: he and his colleagues had created a mutant version of the H5N1 avian flu virus that could be transmitted through the air between ferrets. Not long after, news began to circulate of a similar creation in the lab of Yoshihiro Kawaoka of the University of Wisconsin, Madison. Fouchier and Kawaoka submitted their studies for publication to Science and Nature, respectively, sparking a heated debate over the potential consequences of publishing such research, whether the risky viruses should have been created at all, and, of course, how comparable work should be regulated going forward. In the following article, scientists and policy experts on both sides of the divide discuss their opinions.
QUESTION 1: To Research or Not to Research?
The uproar surrounding the two recent H5N1 studies spotlights the issue of whether or not research on potentially dangerous lab-generated pathogens should have been conducted in the first place. What are the benefits, and do they outweigh the risks?
YES: There are several reasons why it is important to probe whether the H5N1 viruses can be modified to allow mammalian spread. The first is that we do not necessarily understand which of the viral gene products can be modified in ways that facilitate human-to-human spread. There’s also the issue of whether the particular genetic modifications that allow the switch to a different host are in some sense unique. If that (probably unlikely) scenario were to be the case, knowing the location and identity of the changed amino acids could potentially lead us very rapidly to a much deeper understanding of key structural events determining virulence for different species. Then, of course, the rapid sequencing techniques that have recently been incorporated into well-organized and globalized influenza surveillance efforts would allow us to screen readily for those mutations.
The question of why the Fouchier et al. and Kawaoka et al. experiments were done has, in some senses, already been answered. There had been a vigorous debate for years as to whether the H5N1 viruses could ever change in a way that would allow them to spread readily between mammals. We now know that answer, at least for ferrets. Also, we understand that such a change will not necessarily lead to diminished pathogenicity. In addition, it seems likely that just a single mutation in one protein will not be sufficient to allow a switch in host range for the H5N1 viruses, and that different combinations of changes may bring us to the same end.
—Peter Doherty
NO: I believe it was irresponsible to have performed this research in the absence of a risk-benefit assessment. The research is likely to have no, or essentially no, practical benefits. Claims that the lab-generated transmissible H5N1 strains will provide potential benefits in terms of improved surveillance and response are hollow. First, there is no basis to believe that sequences of these strains will match sequences of potential future naturally transmissible H5N1 strains arising from natural selection in natural hosts. Second, even if the sequences did match, they would have minimal practical utility. There is no infrastructure for effective, comprehensive, sequence-based surveillance of H5N1 influenza virus strains, and, in the absence of an approved, manufactured, and stockpiled vaccine against wild-type H5N1 influenza virus, there is no infrastructure or mechanism for effective response. The failure of surveillance and response in the 2009 H1N1 pandemic amply demonstrates that infrastructure and mechanisms for sequence-based surveillance and response are inadequate. The risk/benefit ratio is essentially infinite: high risk relative to zero or near-zero benefit.
Decisions not to perform specific proposed research projects, or to perform them only after modifications to mitigate risk, are routine. Every research project that involves vertebrate animals or human subjects undergoes a review in which risks and benefits are enumerated and weighed, and a decision is made whether to approve the project, to approve with modifications, or to deny approval. Research projects that involve the enhancement of a potential pandemic pathogen’s virulence, transmissibility, or ability to evade countermeasures should be subject to an analogous system of prior review.
—Richard H. Ebright
YES: Nature is full of surprises. Those surprises will inevitably include threats such as deadly new pathogens. Without preparation, we will be unable to understand or respond to new threats. So, yes, unequivocally, we must pursue research into pathogens in order to understand both the organisms and their interaction with immune systems.
—Rob Carlson
YES: Fundamental research cannot reasonably be prohibited simply because it enables potentially dangerous applications. It almost always simultaneously enables beneficial results as well.
—John Steinbruner
NO: The most critical question about any inherently dangerous research is not, “What are the benefits?” It is rather, “Do the benefits outweigh the risks?” In this case, the answer is “No.” The risk that an accidental escape could seed a pandemic with millions of deaths makes a man-made pandemic seem much more likely than a natural one. Indeed, it would take extraordinary benefits, and significant risk reduction through extraordinary biosafety measures, to correct such a massive imbalance. So until adequate security measures are taken, such research should be discontinued.
—Lynn Klotz and Ed Sylvester
YES: Research on H5N1 viruses is essential to preparing for a natural pandemic. Since 1997, when the first human infections with H5N1 viruses occurred, there has been concern about the virus acquiring the ability to transmit from human to human. When this did not happen, many questioned the investment in H5N1 vaccine stockpiles and other countermeasures. To provide scientific evidence to inform policy on this issue, our goal in doing this research is to determine the pandemic potential of H5N1 viruses. By simulating evolutionary changes to the virus in the laboratory—safely under contained conditions—we can evaluate which changes enable H5N1 viruses to transmit in mammals. H5N1 virus transmission studies are considered research priorities by the World Health Organization and the National Institute of Allergy and Infectious Diseases.
We discovered that only a few mutations are needed to enable a virus with an H5 hemagglutinin (HA) to transmit between mammals, that antivirals and vaccines would be effective controls, and that the transmissible virus was not highly pathogenic. Importantly, we found that a subset of the mutations responsible for transmission is already present in naturally occurring isolates. This means there is a significant risk that H5N1 viruses can acquire the additional mutations that may enable human-to-human transmission.
Preparations to prevent a pandemic or an epidemic could now be made based on these data. Vaccines are effective controls for these transmissible viruses, and the mutation data will help identify and evaluate candidate viruses to be used for stockpiled vaccines. Antivirals are also effective against the transmissible viruses, and public health officials in regions with circulating viruses known to have these mutations should seek to obtain antivirals and develop distribution strategies.
Another immediate benefit to public health is using the mutation data to compile an “alert list” for global surveillance teams. With additional research, we will be able to identify other naturally occurring mutations that confer this transmissible phenotype, so that we have a comprehensive list for surveillance purposes. Statements in the news media regarding the extent of surveillance activities for animal influenza viruses in countries where the H5N1 viruses are currently circulating are inaccurate—surveillance teams in these countries are capable of detecting mutations, and the mutation data provided in both Dr. Fouchier’s study and ours will enable on-the-ground monitoring for potentially threatening viruses. However, we agree that the surveillance and response infrastructure could and should be improved in these countries so that the information in our papers and other relevant data can be fully utilized.
Regarding the risks of this research, individuals quoted by the media have irresponsibly commented on “risks” without knowing the precautions and regulatory oversight in place at my lab and the lab in the Netherlands. Our research is conducted with the strictest attention to biosafety and biosecurity following the United States Select Agent guidelines. This was acknowledged at the recent WHO meeting in Geneva. Technical advisory boards that define these research requirements will be making recommendations to ensure that all researchers adhere to similar stringent standards.
—Yoshihiro Kawaoka
YES: It is a matter of trust, and research is based on trust. We trust the student who carries out the experiment; we trust the PI responsible for the studies; and we have to trust that the information that is obtained will be used for the greater good. Many times we have to convince other countries to be open, to share information, to trust us. If we stop trusting others, we will lose our ability to anticipate catastrophic events and to accumulate the information that we still very much need in order to minimize the consequences of such catastrophes. It’s been said that we cannot prevent the emergence of pandemic influenza strains, but we could certainly learn a lot about these viruses in order to anticipate and minimize the effects of such strains.
—Daniel Perez
QUESTION 2: To Publish or Not to Publish?
Whether or not H5N1 work continues as it has, the two studies in discussion are already complete. The question now is how to communicate those data. In December 2011, the National Science Advisory Board for Biosecurity (NSABB) recommended that certain details be removed before publication to limit access to such potentially dangerous information. In February, a World Health Organization committee decided that the work should be published in full, and the NSABB was taking a second look at the issue when this article went to press. The scientific community remains torn. Will redacting details of the studies help prevent sensitive information from falling into the wrong hands—or is it, at this point, already too late?
NO: Having closely followed the reactions prior to and following publication of the NSABB perspective, I remain in full support of the NSABB position to redact the details of the recent H5N1 studies. We failed to properly prepare for the current problem set in the wake of the 2004 National Research Council report Biotechnology Research in an Age of Terrorism (commonly known as the “Fink Report” after committee chair Gerald Fink of the Whitehead Institute for Biomedical Research), which made recommendations pertaining to the review of research of concern; now we are reacting to a crisis. Risk cannot be eliminated, but it can often be better managed, mitigated, and reduced. If so, as in this instance, the “advantage” goes to “the good guys,” at least for a while. Redacting is really about making it more difficult for those seeking to do high-impact harm by limiting access to the knowledge and materials that would enable them to achieve their goals and objectives. It is about “raising the bar” for them while pursuing legitimate science and other security measures that will better protect nations and societies.
As it is in science, security is not “all or nothing” or static, either in viewpoint or in practice. All useful perspectives must be thoughtfully engaged, considered, and balanced. The life science research community does not fundamentally understand, nor has it likely ever had to contend with, how the darkest adversaries think, plan, resource, and operate. Maximizing the likelihood that only responsible scientists have access to such discoveries and use them only for the advancement of global health is one “tool in the kit” to reduce risk. The misuse of new avian flu science might be low probability (at least for the short term), but the potential consequences could be massive. I suggest that we had better get it right; there will be no “do overs.” Selective redaction is not a total or perfect solution, but it could contribute to risk reduction if leveraged with other precautionary steps. Further, what is missing from the debate thus far are the “responsibility” and “accountability” issues, i.e., who will be held responsible and accountable if published science such as this is misused to enable a global health catastrophe? If we are counting solely on full and comprehensive global public-health preparedness—an essential and noble pursuit—we are, in reality, not even close to adequate protection from nature as well as from dark forces.
Author’s Note: I write from my own perspective, not from membership on the NSABB. I have formal training in the practice and management of life and physical sciences and engineering. I also have more than 30 years of actual operational and technical experience in national and global security, including with catastrophic terrorism, WMD terrorism, and biosecurity.
—Randall Murch
YES: Restrictions on the publication of biomedical research are unprecedented. They also are unworkable, at least for research such as this that has already been completed and reported. The primary risks are accidental release through infection of a lab worker who then infects others, or deliberate release by a disturbed or disgruntled lab worker. Efforts to restrict publication will do nothing to address these risks.
—Richard H. Ebright
NO: To be effective, rules regarding the publication of such research would have to be globally applied and thus would have to be developed and implemented by a globally representative organization. That arrangement is far beyond the current state of practice and the current horizon of prevailing attitudes. It is inherently feasible, however, and is ultimately likely to be necessary.
—John Steinbruner
YES: With the recommendation to withhold some details from publication, the NSABB is presently operating under the mistaken assumption that information, once created—and once advertised, as in this case—can be contained. Unfortunately, the proposal to allow secure access only to particular individuals is at least a decade (if not three decades) out of date.
Any attempt to secure the data would have to start with an assessment of how widely it is already distributed. In addition to the computers and e-mail servers at the institutions where the science originated, the information is sitting in the computers of reviewers, on servers at Nature and Science, at the NSABB, and possibly on the various e-mail servers and individual computers of the board members as well. And let’s not forget the various unencrypted phones and tablets all those reviewers now carry around. At this point, it would be remarkable if the information had not already been stolen.
But never mind that for a moment. Let’s assume that all these repositories of the relevant data are actually secure. The next step is to arrange access for selected researchers. That access would inevitably be electronic, requiring secure networks, passwords, etc. But think back over the last couple of years: hacks at Google, various government agencies, and universities. Credit card numbers, identities, and supposedly secret Department of Defense documents are all for sale on the Web. Hackers are always testing, and breaking, computer security systems.
That’s not to say that a case can’t be made for attempting to maintain confidential or secret caches of data, whether in the public or private interest. But in such instances, compartmentalization and encryption must be implemented at the earliest stages of communication in order to have any hope of maintaining security. In the present case, it is far too late. In fact, the hue and cry over the results from Kawaoka and Fouchier has only highlighted the value of the information, creating a perverse incentive to access those results.
And if the information cannot be contained, then restricting access serves only to slow down the general process of science and the development of countermeasures. Science is inherently a networked human activity that is fundamentally incompatible with constraints on communication. Any endeavor that relies upon science is, therefore, also fundamentally incompatible with constraints on communication. Censorship threatens not just science, but also our security.
The only way that potentially dangerous results can be effectively secured is either not to do the research or to refrain from discussing it publicly in the first place. Choosing not to do such research would leave us unprepared for emerging threats, while doing such research in secret leads to difficult questions about who should be in the know. Our only course is to do the research and to discuss it openly.
A different and longer version of Carlson's thoughts can be found on his blog.
—Rob Carlson
NO: The recommendation to redact part of a scientific manuscript was not arrived at without serious deliberation about the potential benefits and risks of making the information widely available. The authors made two significant changes to the virus: they altered its transmission route from fecal-oral to respiratory, and they changed its host range. We know from recent experience that exposure of BSL-3 lab workers to other pathogens, although infrequent, happens. In addition, the experience with the post-9/11 anthrax mailings shows that it is possible that someone may misuse this virus to cause fatalities. Finally, it is common knowledge that terrorist groups such as Al Qaeda are interested in biological agents. Combined with the fact that the technology to repeat the experiments is widely available—making influenza viruses from DNA clones, while not simple, is not beyond someone with basic knowledge of molecular biology and cell culture techniques—it seems prudent to follow the precautionary principle: given the uncertainties, it is better not to proceed with full publication. Redaction is not permanent, but allows us to stop, think, and release the details only when appropriate.
—Michael J. Imperiale
QUESTION 3: How to Regulate?
A global issue that stems from the ongoing H5N1 debate is how to regulate such research. Who should be in charge of granting approval for potentially dangerous studies? At what biosafety level should they be conducted? Who should have access to the full results? And how should all of this be organized and monitored?
A critically important question is, “Should the committee’s oversight decision on research proposals simply be a guideline or be backed by binding regulations?” Scientists, who are committed to maintaining considerable freedom in their activities, abhor regulations. But for research on live potential pandemic pathogens (PPPs), we support regulations.
Guidelines are often ignored. By researching PPPs in Biosafety Level 3 facilities, many labs have ignored two sets of guidelines: the NIH guidelines on biosafety levels for pathogen research and the Fink guidelines for “experiments of concern.” The NIH guidelines clearly state, “Biosafety Level 4 is required for work with dangerous and exotic agents that pose a high individual risk of aerosol-transmitted laboratory infections and life-threatening disease that is frequently fatal, for which there are no vaccines or treatments, or a related agent with unknown risk of transmission.” And the Fink guidelines call for the implementation of a review system for experiments of concern, which include those that “would enhance the virulence of a pathogen...would increase transmissibility of a pathogen…would alter the host range of a pathogen.” The guidelines could not be clearer.
—Lynn Klotz and Ed Sylvester
Obviously, some research is riskier than other research; however, fear of what could be done with the results should not be part of the research agenda if the research itself has the ultimate purpose of benefiting humankind. If we start scrutinizing the type of research that can be done, and weighing whether the research itself can be used to cause harm, then the terrorists have already won; they have already succeeded in instilling fear in all of us.
—Daniel Perez
We should understand as we discuss this issue that human beings tend, like water, to follow the available, easy path. The more rigorous the safety constraints, the more cumbersome the procedures, the less science will be done. Of course, when working with dangerous human or veterinary pathogens, we operate with an absolute obligation to minimize risk. At the same time, it is important that the level of security imposed should be appropriate and not excessive.
—Peter Doherty
For lines of research that entail extreme levels of potential danger, it is appropriate, and indeed urgent, to impose mandatory oversight rules to assure that informed judgments are made regarding social consequence as well as scientific merit. It is natural to hope, but unrealistic to believe, that the danger revealed by the recent H5N1 experiments will immediately inspire significant official initiative. Major governments are already beleaguered by obviously defective regulatory practices involving more widely recognized sources of danger—financial transactions and nuclear explosives among them. They will predictably focus on natural evolution of the H5N1 strain and might marginally improve disease surveillance and epidemiological response measures already developed.
But that is not a valid excuse for complacency. Some prudential oversight of highly consequential biological research is currently practiced, but prevailing procedures are largely voluntary in character, are not consistently or comprehensively applied, and do not have global scope. For pathogens like a highly transmissible H5N1 virus, this is not sufficient.
There is good reason to fear that nothing will be done until some massive misfortune belatedly compels attention. Indeed, a few leading research scientists understand the implications of the situation, but most are categorically opposed to any mandatory intrusion into basic research. But it is also prudent to assume that protective regulation will ultimately be imposed. There is too much at stake for the impediments to prevail indefinitely. Whether comprehension evolves naturally or is forced by disaster, mandatory oversight will eventually be indispensable. The sooner and the more gracefully that is realized, the better off we all will be.
—John Steinbruner
It is tempting to suggest that restricting publication, or delegating responsibility to an international organization, could ensure that the details of research on H5N1 do not circulate widely. But there is no international body that can effectively enforce such a requirement. That approach might therefore do little more than lend a false sense of security that the problem is being addressed, with less attention paid to building international consensus on best practice.
Indeed, lessons from the past suggest that international regulation of genetic research would be a lengthy process and might fail even to reach agreement on verification procedures. More than 30 years of discussion on policies regarding biological weapons failed in the early 2000s to produce an internationally binding verification protocol under the Biological Weapons Convention. We do, however, have an opportunity to begin a process that could contribute to a solution. If and when consensual best-practice guidelines are developed for this emerging science, voluntary and responsible adherence to best practice for legitimate research is within the realm of possibility.
Such was recently demonstrated by the 39 researchers from around the world who agreed, in an open letter to the journals Science and Nature, to a voluntary 60-day pause on any research involving highly pathogenic H5N1 influenza viruses that could lead to greater transmissibility, and by the World Health Organization, which convened discussions on this topic in mid-February. Best practices developed by such a group would take much longer than 60 days to emerge, but could eventually lead to consensus on an internationally accepted best practice that could serve as a point of reference for guiding genetic research on the H5N1 virus.
And of course, though H5N1 is the present concern, the discussions should be broadened to all microbial research, because it is clear that researchers are using similar techniques to modify and engineer other pathogens. Beyond the de novo synthesis of a poliovirus early in this millennium and the publication of the reconstructed genetic sequence of the 1918 pandemic influenza virus, genetic manipulation of viruses is a tool used in research aimed at developing new vaccines against H5N1 and other, as yet unknown, pathogen threats.
—David L. Heymann
CONTRIBUTORS
Rob Carlson is a principal at Biodesic, an engineering, consulting, and design firm in Seattle. He is the author of Biology is Technology: The Promise, Peril, and New Business of Engineering Life.
Peter Doherty is an immunologist at the University of Melbourne, who shared the 1996 Nobel Prize in Physiology or Medicine for his work on immune recognition of virus-infected cells.
Richard H. Ebright is a chemistry professor and microbiologist at Rutgers University.
David L. Heymann is head and senior research fellow of the Chatham House Centre on Global Health Security.
Michael J. Imperiale is a professor and associate chair in the department of microbiology and immunology at the University of Michigan and a member of the US National Science Advisory Board for Biosecurity.
Yoshihiro Kawaoka is a virologist at the University of Tokyo and the University of Wisconsin-Madison and led one of the research teams that evolved influenza viruses possessing an H5 hemagglutinin (HA) to be capable of aerosol transmission between ferrets.
Lynn Klotz is a senior science fellow at the Center for Arms Control and Non-Proliferation in Washington, DC. With Ed Sylvester, he coauthored Breeding Bio Insecurity: How U.S. Biodefense is Exporting Fear, Globalizing Risk, and Making Us All Less Secure (University of Chicago Press, 2009).
Randall Murch is a Professor in Practice in the School of Public and International Affairs at Virginia Tech, and a member of the National Science Advisory Board for Biosecurity.
Daniel Perez is a virologist at the University of Maryland.
John Steinbruner is a professor of public policy at the University of Maryland and director of the Center for International and Security Studies at Maryland.
Ed Sylvester is a professor of journalism at Arizona State University’s Walter Cronkite School of Journalism, where he teaches science and medical writing.