Within the burgeoning field of synthetic biology, teams of biologists and engineers are making great strides in understanding the cell and its functioning. (See The Scientist’s recent feature on the topic.) However, there is more that should be discussed than the triumphs. There are also the dark purposes to which science (and synthetic biology in particular) can be put. Worries range from the development of pathogenic bioweapons to the potential contamination of native gene pools in our environment. The question is, are scientists responsible for the potentially negative impacts of their work?
Some have argued that the answer to this question is no—that it is not researchers’ responsibility how science gets used in society. But that is sophistry. Scientists are responsible for both the impacts they intend and some of the impacts they do not intend, if those impacts are readily foreseeable in specific detail. These are the standards to which we are all held as moral agents. If I were to negligently throw a used match into a dry field (merely because I wanted to dispose of it), for example, I would be responsible for the resulting wildfire. In contrast, Einstein was not responsible for the use of his E=mc2 equation in building the atomic bomb, or for the bomb's use in wartime, though the scientists at Los Alamos were.
Of course, impacts (whether harmful or beneficial) are not solely scientists’ responsibility—others involved will also bear responsibility for their actions. If scientific knowledge is used in a biological attack, the terrorists are first and foremost responsible for their heinous act. But the researchers who generated the knowledge may also be partly responsible. Consider, for example, the knowledge of how to build a virus like smallpox from the ground up or how to create other pathogenic, tailored organisms—targeted either to humans or the foods on which we depend. If it is readily foreseeable that such knowledge could be used for nefarious purposes, the scientists who introduce such new technological capacities are partially responsible for an attack that could ultimately cause millions of deaths.
Scientists can no longer hope naively that people will only use science for the public good. The world will always have the mentally unbalanced, the delusional, the vicious, and the sociopathic members of society, some of whom will also be intelligent enough to use the results of science. Recognizing this should be part of the everyday backdrop of science, informing the assessment of a project's potential and the desirability of pursuing it.
As scientists plumb the depths of the cell, they must be particularly cognizant of the potentially harmful uses of their work, in addition to all its intended benefits. For example, knowledge of how to generate specific strings of nucleotides with high precision greatly aids research by providing particular and accurate DNA sequences with which scientists can assess cell functioning and design new living systems. But such knowledge can also produce the raw materials for building known pathogens from scratch, as has already been done (for research purposes) with the polio virus and the Spanish flu virus. As scientists develop ways to generate sequences of base pairs ever more cheaply and efficiently, the opportunity for the malicious or the simply unreflective to play with pathogens to see what kind of traits arise looms larger. And it is not just technological know-how that can be problematic. Detailed knowledge of cellular or genetic functioning can have worrisome implications as well. Knowledge of what makes a virus more transmissible can assist us in detecting when a virus might be more prone to producing an epidemic, but it could also be used to make viruses more virulent.
In sum, scientists are responsible for both what they intend to achieve and that which is readily foreseeable, as we all are. There is nothing inherent in becoming a scientist that removes this burden of responsibility. The burden can be shared—scientists can come together to decide how to proceed, or ask for input from ethicists, social scientists, even the broader public. Alternatively, scientists could decide (and it has been proposed) that some forms of regulation—either in the selection of projects or in the control and dissemination of results—be imposed on the field of synthetic biology, to reduce the risks. The more oversight scientists submit to, the less responsibility they bear, but it comes at the cost of the freedom to choose the type of work they can do and how they do it. This is the essential tension: as long as there is freedom of research, there is the responsibility that comes with it.
Heather E. Douglas is the Waterloo Chair in Science and Society at the University of Waterloo. She earned her PhD in History and Philosophy of Science in 1998 from the University of Pittsburgh, where she is currently a visiting associate professor. Her book, Science, Policy, and the Value-Free Ideal, published in 2009, examines the moral responsibilities of scientists.