Neurons in new brains and old
October 1, 2015
Illustration: ANDRZEJ KRAUZE

It’s hard to wrap one’s mind around the human brain. With its 86 billion neurons, even greater numbers of glial cells, a quadrillion synapses, and millions of miles of axons, this intricate organ doesn’t readily reveal its inner workings. But its complexity hasn’t kept researchers from striving to sort out the details of the brain’s form and functions.
Fascination with the brain is age-old. Gross anatomical dissections, starting in about 280 BCE in Alexandria and (after more than a thousand years of prohibition) resuming in the Renaissance, segued into microscopic examination. Over the centuries, structural drawings by Camillo Golgi and Santiago Ramón y Cajal have morphed into diagrams of the connectome and super-resolution images and videos of neurons in action. Continuous development of new techniques now allows neuroscientists to probe ever deeper into how the brain works.
During the long history of neurobiology, dogmatic beliefs about the brain have arisen, only to be toppled by new findings. In our annual issue dedicated to neuroscience, two features describe such dogma-busting research. In “Sex Differences in the Brain,” neuroscientist Margaret McCarthy debunks the idea that male and female brains differ only in areas related to reproduction. Certain areas of male and female brains differ significantly from fetal development right on through adolescence and into adulthood, and McCarthy explains that it’s not just the neuronal connections; the behavior of glial cells also differs between the sexes. The upshot is that research on brain function must include female as well as male subjects to fully understand the importance of such differences.
Another long-held belief that has bitten the dust in the last few decades is that adult human brains do not generate new neurons. True, most of our lifetime supply of neurons is produced before birth; they proliferate, in fact, so overexuberantly in the fetal brain that half of them die before we are born. As people age they continue to lose neurons, albeit at a far slower rate. But even as the adult brain loses neurons, we now know, it also gains new ones. In “Brain Gain,” Senior Editor Jef Akst reports on the role played by these new neurons, which are especially prominent in the hippocampus, a brain region vital to learning and memory. As one investigator puts it: “We think that [adult] neurogenesis provides a way, a mechanism of living in the moment. . . . It clears out old memories and helps form new memories.” The article gives a sense of the excitement felt by researchers eager to understand what these new neurons do and to possibly harness neurogenesis to ameliorate psychiatric disorders and neurodegenerative diseases.
That the brain’s glial cells—astrocytes, oligodendrocytes, and microglia—are no more than a support system for neurons is another belief that has been upended. Glia myelinate axons, prune synapses, and perform valuable immune functions. A Lab Tools article, “Into the Limelight,” catalogs new techniques for isolating and culturing glial cells, which permit gene-profiling studies, as well as methods for in vivo monitoring of astrocyte signaling. The article also touches on gliobiologists’ use of simpler model systems, such as flies, worms, and zebrafish, that have proved so useful in studying neurons.
Handedness and language processing have long been thought to share a genetic basis because they are both highly lateralized in the brain. One Notebook article describes a study of multiple generations of 37 families of Dutch lefties that casts doubt on any genetic overlap, while another parses how the brain processes whistled language, in this case across-the-valley communications in a remote region of Turkey.
The brain still has many surprises to reveal. As more old, entrenched ideas about how the brain works are pruned away, a fuller understanding of this marvelous organ is bound to emerge.
Mary Beth Aberlin Editor-in-Chief