A study published last week expands the redesign of the 4-billion-year-old genetic code from a four-letter alphabet to an eight-letter alphabet by incorporating artificial nucleotides. The scientists, led by Steven Benner of the Foundation for Applied Molecular Evolution and Firebird Biomolecular Sciences in Florida, have also identified a bacteriophage RNA polymerase variant that transcribes the synthetic DNA into synthetic RNA.
Previously, a scientific team led by Floyd Romesberg at The Scripps Research Institute used its own synthetic base pairs to create bacteria that replicate the artificial DNA, transcribe it into mRNA, and incorporate new types of amino acids into proteins.
These are technologically impressive accomplishments, and the translational applications of these discoveries could be revolutionary. In their paper, Benner and colleagues discuss the potential to enhance the power of biocomputers logarithmically to encode videos or to create RNAs as catalytic biologicals. Romesberg has written about the possibility of incorporating his six-base synthetic DNA into a functional PCR assay or using it to create a new, sensitive test for viruses in the blood. He has started a company called Synthorx to develop this and other products to treat autoimmune diseases and cancer. The firm’s technology may also offer a unique opportunity to create new vaccines or novel organisms for bioremediation.
Recognizing these technological achievements in all their complexity requires acknowledging possible ethical boundaries to their uses. One of the remarkable features of nucleic acid biology is the capacity of every living organism to correct most, but not all, errors in normal germ-line DNA replication. The remaining errors are the source of the novelty that natural selection depends upon: when circumstances change, some members of a species survive precisely because of such sustaining mutations.
The accuracy of DNA replication is quite high because replicative DNA polymerases efficiently select correct nucleotides for the polymerization reaction and because cells carry out DNA repair, both pre-replicative and post-replicative, through nucleotide-excision, base-excision, and mismatch-repair processes. Even bacteria possess effective DNA repair machinery. It is estimated that a wrong nucleotide is incorporated only once per 10⁸–10¹⁰ nucleotides polymerized. In the absence of DNA repair, more than 6,000 mistakes per human genome per cell division would occur, which could be disastrous if unchecked. Until we know that cells can scan and repair mutations in synthetic DNA, it would be premature to create organisms using synthetic DNA.
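The arithmetic behind these figures can be sketched in a few lines. The genome size and the unrepaired error rate below are illustrative assumptions chosen to be consistent with the numbers cited above, not measured values:

```python
# Back-of-the-envelope check of the replication-fidelity figures cited above.
# GENOME_BP and RAW_ERROR_RATE are illustrative assumptions.

GENOME_BP = 6e9            # approx. human genome copied per cell division, base pairs
RAW_ERROR_RATE = 1e-6      # assumed per-nucleotide error rate absent DNA repair
REPAIRED_RATE_HIGH = 1e-8  # cited fidelity range with proofreading and repair:
REPAIRED_RATE_LOW = 1e-10  # one error per 10^8 to 10^10 nucleotides

errors_without_repair = GENOME_BP * RAW_ERROR_RATE
errors_with_repair = (GENOME_BP * REPAIRED_RATE_LOW,
                      GENOME_BP * REPAIRED_RATE_HIGH)

print(f"Errors per cell division without repair: ~{errors_without_repair:,.0f}")
print(f"Errors per cell division with repair: "
      f"~{errors_with_repair[0]:.1f} to ~{errors_with_repair[1]:.0f}")
```

Under these assumptions, repair machinery reduces thousands of potential mistakes per division to a handful or fewer, which is the gap the article's safety argument turns on.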
Natural selection, acting on the errors that DNA repair leaves behind, has proven effective at generating the genetic diversity that maintains life on Earth with our four-nucleotide system. Introducing deep structural changes to DNA chemistry within living systems could generate unpredictable, possibly lethal outcomes by allowing natural selection to proceed as a competition between current life forms and new ones built on modified genetic codes.
A second concern is the potential of this technology to create organisms that could be used as bioterror weapons. Dual-use research—biological experiments with legitimate scientific goals that may be misused to threaten public health or national security—is a serious problem. While this worry is not restricted to synthetic DNA, we hope that governments that now monitor whether groups are purchasing viral sequences that could be used to create bioterror weapons will recognize that the capacity to incorporate synthetic DNA into virulent pathogens is a new source of potential dual-use weaponry.
We urge the convening of a commission of scientists and the agencies that fund their research to discuss whether some new lines of work using modified genetic codes should be suspended until their long-term safety can be analyzed under strictly controlled conditions. Such a commission can have a positive effect on eventual research using a new technology, as the Asilomar conference of 1975 demonstrated: there, for the first and to date only time, scientists self-regulated their own research, in that case on gene transfer in bacteria. (One of us, R.P., initiated the call to organize the meeting at Asilomar that led to the guidelines of self-regulation.)
Because self-regulation along the lines of Asilomar might not be effective today, given the privatization and global reach of much of medical science, an obvious step that could be taken quickly would be to require synthetic-genetic-code biologists conducting experiments with live organisms to operate under current government biohazard standards and regulations, and to call for international agreement on such regulations, perhaps through the United Nations.
Should there be a genetic firewall in research? Do we want our descendants to see who wins in a future competition between artificial and natural DNA-based life forms? There is a conceptual difference between reprogramming existing genetic components using gene-editing technologies and using synthetic DNA to redesign living organisms. In sum, we are concerned that unregulated research in synthetic DNA may lead to an inadvertent disaster, and we strongly believe that in research and medicine, actions should be guided not by what one can do, but by what one ought to do.
John D. Loike, a professor of biology at Touro College and University Systems, writes a regular column on bioethics for The Scientist. Robert Pollack is a professor of biological sciences at Columbia University.