The Long View

In the era of Big Data, research projects that focus on phenomena that unfold across decades have distinct benefits—and some drawbacks.

By | July 1, 2013

BACK OFF: A mother marmot with her pups makes an alarm call. © BEN HULSE
This spring, ecologist Dan Blumstein’s research team headed out to field sites in Colorado to study yellow-bellied marmots (Marmota flaviventris), large mountain rodents related to ground squirrels. The team’s observations will help determine how the animals are responding to a major population crash that occurred in 2011. “This year there was not a lot of snowpack, and that acts as insulation for animals underground,” says Blumstein, a professor at the University of California, Los Angeles. “It’s always, ‘Are there going to be any animals?’”

The precipitous drop in marmot numbers came as a bit of a surprise. For 40 years, marmot populations in the Upper East River Valley of Colorado had fluctuated—from 100 animals, to 60, to 30, and back up again. But during the decade preceding the crash, a decade of shorter-than-average winters, the marmot population had surged. “It tripled in size and got up to 300 animals. We were sort of overrun with marmots,” says Blumstein. Then, after an unusually long winter in 2011, the population plummeted. Blumstein says the marmot study highlights the value of long-term data sets spanning decades. “Had we quit after the first 40 years, we would have had a different perspective on what’s possible,” he says.

“Long-term data sets create a certain data richness that we don’t get from big-data snapshots,” says Samuel Arbesman, a senior scholar at the Kauffman Foundation and a fellow at the Institute for Quantitative Social Science at Harvard University. The Framingham Heart Study, for instance, has been running for 65 years, and has yielded hundreds of findings on myriad diseases. But Arbesman says short-term, big-data studies are in vogue, and with limited funding opportunities, long-term studies are taking a backseat.

“It is exceedingly difficult to get funding for long-term research,” agrees Blumstein. His lab receives support from the National Science Foundation’s Long Term Research in Environmental Biology (LTREB) program. Its budget is only about $3 million, and it supports around 75 to 80 projects, each of which lasts 5 to 10 years. “We could use a lot more money to fund that kind of research,” says Saran Twombly, the program manager of LTREB. Another $30 million from NSF goes to fund research at ecological stations that conduct long-term studies, including 40 years of songbird observations in New Hampshire and a decades-long experiment involving the addition of fertilizer to an arctic stream.

MARMOT TRAP: Blumstein and his team capture marmots, collecting data for the long-term study of their population fluctuations. RMBL MARMOT PROJECT

“The biggest dilemma that NSF faces in these long-term projects is that everyone recognizes the value of long-term data, but our research environment is so competitive that the scientists have to be continually asking exciting new questions,” Twombly says. And that means that exploratory, open-ended, observation-type projects—the kind that can gather answers to questions that haven’t yet been asked—are much lower priorities than studies that fit into 2- or 3-year funding cycles.

In addition to the long wait before they bear any fruit, long-term projects are also a logistical hassle. Consider the challenge of archiving, transferring, and analyzing data. In 1962, when the marmot population dynamics study began, data were recorded with pen and paper. Since then, various methods of data collection and storage have come and gone, as have the scientists who first worked on the project. It can be difficult to anticipate the data-handling methods that will be the most useful and accessible to future generations.
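One strategy that has held up is to keep raw observations in self-describing plain text, which any future software can read. Below is a minimal sketch in Python illustrating the idea; the file name, field names, and values are hypothetical, not the marmot project's actual schema.

    # Minimal sketch: archive field observations as plain-text CSV with a
    # self-describing header row, so the data remain readable decades later.
    # The schema here is hypothetical, not the marmot project's actual format.
    import csv
    from datetime import date

    observations = [
        {"date": date(2013, 5, 14).isoformat(), "colony": "Picnic Meadow",
         "animal_id": "M-1042", "mass_g": 3150},
    ]

    with open("marmot_observations.csv", "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=["date", "colony", "animal_id", "mass_g"])
        writer.writeheader()          # column names travel with the data
        writer.writerows(observations)

The point is not the particular tool but the format: column names stored alongside the values, in a text file that needs no special software to open.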

Arbesman says the other big difficulty in designing long-term studies is knowing what to go after. “You have to say, ‘OK, are we asking all the right questions that are not only going to be relevant now . . . but based on technological change in a hundred years, [will they] be able to actually answer the kind of questions that people will still be interested in?’” Blumstein says he wishes he could have anticipated the importance of hormones, for instance. “Wouldn’t it be great to have samples to look at those things over a longer timescale?”

Jamie Allan, the program director for the Integrated Ocean Drilling Program (IODP) at NSF, argues for casting a wide net in data sampling rather than going after targeted questions. “One trend that probably went too far was a testing of specific hypotheses rather than asking broad questions,” he says. Allan adds that there is renewed appreciation at IODP and NSF for this type of exploration, exemplified by a new, $400 million Ocean Observatories Initiative.

From long-term observational data, Allan says, “you can get all kinds of surprises.” Keir Becker, a professor at the University of Miami, has been coordinating research at subseafloor drilling sites. One borehole observatory off the coast of Japan has been recording pressure below the seafloor since 2001. After a 9.0-magnitude earthquake on March 11, 2011, the observatory—hundreds of kilometers away from the epicenter—measured a huge spike in pressure. The finding sheds light on how tectonic plates are moving. “It took us 10 years to get that signal,” Becker says. “It’s not something you can plan, because we can’t predict when an earthquake is coming.”


Comments

Paul Stein

July 30, 2013

I didn't see any "drawbacks" in this article, just difficulties. Having seen a number of data-handling technologies, software and hardware, come and go in rapid succession over the past forty years, I can say a lot for the durability of pen-and-paper laboratory notebooks, as well as microfiche.

Jim Steele

July 30, 2013

I agree wholeheartedly with the need for long-term data sets. Random weather events often create spurious correlations, and the effects of 20- to 60-year cycles like the Pacific Decadal Oscillation will affect microclimates in various ways. We cannot evaluate the effects of climate change without studies that encompass those long-term cycles.

After 15 years of monitoring songbirds, we first thought a population decline was due to climate change, but as time went on we realized the problem was watershed degradation. It would have been very valuable to continue monitoring the resiliency of those populations once the restoration was complete. However, after a few years of documented success, the funding was pulled. Monitoring is simply not politically "exciting." I suspect this induces researchers to exaggerate certain issues to help attract funding, and to overstate the importance of their results to keep funding coming. Funding long-term monitoring is essential, and it simply acknowledges that we are a long way from knowing everything. The questions and insights that arise from monitoring will evoke better research questions.
