By Ajit Varki and Danny Brower
Twelve, June 2013
Denial is not just a river in Africa, but the authors of this odd collaboration think it helped us cross one. Ajit Varki and the late Danny Brower acknowledge that our penchant for denying reality gets humans into constant trouble and could be the death of us; yet, they propose, without it we not only couldn’t survive—we would never have become fully human. How the book came to be is almost as unusual as its thesis. After a lecture, Varki, a professor in UC San Diego’s School of Medicine who found the first biochemical mutation distinguishing humans from chimps, was approached by University of Arizona molecular biologist Brower, who shared his fascination with human origins. When Brower died suddenly, Varki completed his manuscript, which hypothesized that any species reaching a critical threshold of intelligence—“full theory of mind,” essential for communication, cooperation, and culture—must hit “The Wall”: the awareness of personal death. (“Others are like me; others die; I will die.”) That realization renders the best and brightest too dread-stricken to reproduce successfully. Only one species, Homo sapiens, found a work-around—make-believe, such as an afterlife—that let us break through. Convoluted and unprovable, the thesis is nonetheless intriguing.
By Leslie Valiant
Basic Books, June 2013
Computer scientist Leslie Valiant celebrates Alan Turing as the progenitor of a third scientific revolution, potentially as profound as Newton’s and Einstein’s in transforming our understanding of the world. Why not a “fourth revolution”—why omit Darwin? Because, Valiant dares to say, Darwin’s theory is radically incomplete, and until it is equipped to make quantitative, verifiable predictions, evolution by natural selection cannot account for the complexity of living things and is not “more than a metaphor.” But Valiant offers no drop of succor to creationists. Rather, he seeks to arm neo-Darwinian theory against their onslaughts by elucidating the mechanistic, quantitative basis it must have in a world “without a designer.” The algorithms of computational learning theory, he posits, will be key—in particular, a special kind he calls “ecorithms,” which incorporate information gathered from the environment to improve an organism’s “performance.” Turing’s heirs, he suggests, have only just begun to work out those equations.
By Suzanne Corkin
Basic Books, May 2013
“It’s a funny thing—you just live and learn. I’m living, and you’re learning.” Thus the most important patient in the history of neuroscience, Henry Molaison, described his decades-long collaboration with the scientists who learned volumes from his tragedy: the almost total loss, at age 27, of his ability to form new memories after bilateral temporal lobe surgery for epilepsy removed his hippocampi. Foremost among those scientists, Suzanne Corkin first tested Molaison in 1962 as a grad student at McGill University. Over the next 46 years, as Corkin, now professor emerita at MIT, conducted countless studies of Molaison’s wounded brain and the capacities it had lost or retained, she became the one lasting constant in his life, yet had to be reintroduced each time they met. Her book is a scientific and human monument, touching in its regard for the man (he had a sense of humor, as does she) and breathtaking in its detailed account of the discoveries about the localization and coordination of different aspects of memory made possible by refinements in brain-scanning technology and by Molaison’s untiring cooperation.
By Allen M. Hornblum, Judith L. Newman, and Gregory J. Dober
Palgrave Macmillan, June 2013
The authors of this disturbing book identify three factors that allowed orphaned, troubled, and handicapped children to be routinely sought by scientists (even Jonas Salk) and “volunteered” by their institutional keepers as human guinea pigs during the 20th century—infected with hepatitis and ringworm, injected with experimental vaccines, dosed with plutonium and LSD. First was the eugenics craze that swept the West between 1890 and 1945, infecting even some eminent minds with the desire to cleanse society of “degenerates”—a catchall encompassing sufferers of Down syndrome and stuttering, epilepsy and alcoholism, cerebral palsy and poverty, blindness and every shade of nonwhite. Second was the urgency of World War II and then the Cold War—struggles to which everyone was expected to contribute and sacrifice, willy-nilly. And third was the burgeoning prestige of medical science, flush with government and corporate cash and revered by the culture, intoxicating its practitioners with altruism and ambition. Two things would have made this exposé stronger: a tighter focus on fewer examples, and less polemical editorializing: “shocking,” “cavalier,” “zealous,” “shenanigans.” The bare facts are deplorable; let the reader do the deploring.