The first draft sequence of the human genome, the culmination of more than a decade of scientific effort, was published over 20 years ago. But it's only now that researchers may finally have the sequencing and computational tools needed to sort through the complex, repetitive sequences that were left out of that draft. Although it represents only a small portion of the genome, this missing data has prevented scientists from fully understanding the genetic basis for traits and diseases, as Brianna Chrisman and Jordan Eizenga explained in our first feature in September. “From where we stand now, the future of the human reference genome looks bright,” they concluded.
In retrospect, it’s perhaps no wonder that Svante Pääbo was awarded the Nobel Prize for paleogenomics. The field of ancient DNA (aDNA) has exploded over the last decade, as advances in sequencing have made what were once considered impossible tasks—from high school students sequencing whole genomes to scientists obtaining informative reads from molecules that are more than 2 million years old—into reality. But many in the field worry that ethical frameworks meant to keep researchers in check have not kept pace with the technology at their disposal. As Keolu Fox, an Indigenous genomic anthropologist at the University of California, San Diego, told The Scientist earlier this year: “We really should be questioning the underlying ethics, because some research can be extractive and exploitative.”
To date, the bulk of genetic research has focused on DNA sequences. But the three-letter genetic code that specifies amino acids isn’t the genome’s only language: More and more, geneticists are finding that epigenetic instructions play essential roles in evolution and disease. Whether these instructions are passed down from generation to generation, however, remains hotly debated. Some scientists argue that exposures and experiences in one’s life can spur epigenetic changes that pass to the next generation, creating a harmful ripple effect. Others remain unconvinced by the evidence to date and say much more is needed to make such strong claims. Whichever side is right, determining how epigenetics shapes phenotypes and their heritability will undoubtedly contribute to a deeper understanding of human health and disease.
One of the many insights gleaned from the increasing accessibility of whole genome sequencing is that the barrier between different species’ genomes is far less firm than scientists initially imagined. While the lateral transfer of DNA occurs far less frequently in animals, plants, fungi, and protists than it does in bacteria, it is now irrefutable that it does happen—and that such horizontal gene transfers can have marked impacts on evolution. “We want to think not [just] in terms of number, but also in terms of impacts,” Université Paris-Saclay evolutionary biologist Clément Gilbert told The Scientist for the July feature. “Perhaps just one transfer may have had a huge impact on the viability of some species.”
The field of environmental DNA (or eDNA) has exploded in recent years, with scientists now able to spot all kinds of plants and animals from the tiny traces of themselves they shed every day. But while eDNA can reveal the presence or absence of species, it can only indirectly speak to the potential causes behind shifts in communities. Environmental RNA (eRNA), on the other hand, has the potential to reveal real-time impacts such as toxin exposures and environmental stresses—as long as the molecules aren’t too fragile to be recovered and sequenced. A November 18 bioRxiv preprint reported evidence that heat stress can be observed in water flea eRNA—arguably the first evidence that eRNA is, indeed, a viable tool. “There’s a lot of attention on biological monitoring, and the kinds of ways we can quickly and efficiently and noninvasively collect data,” Joanne Littlefair, a molecular ecologist at Queen Mary University of London, told The Scientist, so it’s a “good time” for eRNA to make the leap from theory to real-world application.