This spring, ecologist Dan Blumstein’s research team headed out to field sites in Colorado to study yellow-bellied marmots (Marmota flaviventris), large mountain rodents related to ground squirrels. The team’s observations will help determine how the animals are responding to a major population crash that occurred in 2011. “This year there was not a lot of snowpack, and that acts as insulation for animals underground,” says Blumstein, a professor at the University of California, Los Angeles. “It’s always, ‘Are there going to be any animals?’”
The precipitous drop in marmot numbers came as a bit of a surprise. For 40 years, marmot populations in the Upper East River Valley of Colorado had fluctuated—from 100 animals, to 60, to 30, and back up again. But during the decade preceding the crash, a decade of shorter-than-average winters, the marmot population had surged. “It tripled in size and got up to 300 animals. We were sort of overrun with marmots,” says Blumstein. Then, after an unusually long winter in 2011, the population plummeted. Blumstein says the marmot study highlights the value of long-term data sets spanning decades. “Had we quit after the first 40 years, we would have had a different perspective on what’s possible,” he says.
“Long-term data sets create a certain data richness that we don’t get from big-data snapshots,” says Samuel Arbesman, a senior scholar at the Kauffman Foundation and a fellow at the Institute for Quantitative Social Science at Harvard University. The Framingham Heart Study, for instance, has been running for 65 years, and has yielded hundreds of findings on myriad diseases. But Arbesman says short-term, big-data studies are in vogue, and with limited funding opportunities, long-term studies are taking a backseat.
“It is exceedingly difficult to get funding for long-term research,” agrees Blumstein. His lab receives support from the National Science Foundation’s Long Term Research in Environmental Biology (LTREB) program. Its budget is only about $3 million, and it supports around 75 to 80 projects, each of which lasts 5 to 10 years. “We could use a lot more money to fund that kind of research,” says Saran Twombly, the program manager of LTREB. Another $30 million from NSF goes to fund research at ecological stations that conduct long-term studies, including 40 years of songbird observations in New Hampshire and a decades-long experiment involving the addition of fertilizer to an arctic stream.
“The biggest dilemma that NSF faces in these long-term projects is that everyone recognizes the value of long-term data, but our research environment is so competitive that the scientists have to be continually asking exciting new questions,” Twombly says. And that means that exploratory, open-ended, observation-type projects—the kind that can gather answers to questions that haven’t yet been asked—are much lower priorities than studies that fit into 2- or 3-year funding cycles.
Long-term projects are not only unappealingly slow to bear fruit; they are also a hassle. Consider the challenge of archiving, transferring, and analyzing data. In 1962, when the marmot population dynamics study began, data were recorded with pen and paper. Since then, various methods of data collection and storage have come and gone, as have the scientists who first worked on the project. It can be difficult to anticipate which data-handling methods will be the most useful and accessible to future generations.
Arbesman says the other big difficulty in designing long-term studies is knowing what to go after. “You have to say, ‘OK, are we asking all the right questions that are not only going to be relevant now . . . but based on technological change in a hundred years, [will they] be able to actually answer the kind of questions that people will still be interested in?’” Blumstein says he wishes he could have anticipated the importance of hormones, for instance. “Wouldn’t it be great to have samples to look at those things over a longer timescale?”
Jamie Allan, the program director for the Integrated Ocean Drilling Program (IODP) at NSF, argues for casting a wide net in data sampling rather than going after targeted questions. “One trend that probably went too far was a testing of specific hypotheses rather than asking broad questions,” he says. Allan adds that there is a growing appreciation at IODP and NSF for this type of exploration, exemplified by a new, $400 million Ocean Observatories Initiative.
From long-term observational data, Allan says, “you can get all kinds of surprises.” Keir Becker, a professor at the University of Miami, has been coordinating research at subseafloor drilling sites. One borehole observatory off the coast of Japan has been recording pressure below the seafloor since 2001. After a 9.0-magnitude earthquake on March 11, 2011, the observatory—hundreds of kilometers away from the epicenter—measured a huge spike in pressure. The finding sheds light on how tectonic plates are moving. “It took us 10 years to get that signal,” Becker says. “It’s not something you can plan, because we can’t predict when an earthquake is coming.”