Video Extra: Visit our homepage for a video of Harvard Medical School's R. Clay Reid demonstrating the microscale organization of visual nerve cells in the living animal brain.


The workshop, cosponsored by the Redwood Neuroscience Institute (RNI) of Menlo Park, Calif., was sparked by the impending arrival of large-scale multiple-neuron recording, which forces us to imagine a new era of theoretical/empirical exchange in neuroscience. Neuroscience's history has too often afforded examples of grand theoretical speculations essentially immune to empirical test. Moreover, experiments have harvested floods of data describing infinitesimal parts of the brain's total activity (e.g., one neuron out of ten billion, or the average activity level of one million neurons together). Thus, theory and experiment have often gone their own ways.

At times, theory and experiment have come together successfully. Donald Hebb first proposed the basic physiological mechanism underlying learning in 1949.2 Although it stood for decades without experimental support, his basic idea, under the names of short- and long-term potentiation, is now supported by a wide array of experiments.3,4

But that's not the norm. The experimental paradigm until now has been a struggle for a handful of data, which are then exhaustively analyzed within the laboratory that managed to collect them. Generally, independent data analysts and theorists have had little access to important datasets. At the MSRI/RNI workshop, one theoretical neuroscientist commented over coffee about the circular trap blocking interaction: "Theorists can't get data to compare with theories, because experimentalists don't care about theoretical work. They [experimentalists] don't know of many success stories from collaboration ..., because theorists didn't have access in the past."

Arguably, the major issue dividing theorists and experimentalists in recent years has been whether information processing in the brain is carried out by single cells, each representing one ingredient of a thought and communicating with one another via firing rates, or by neuronal assemblies whose integrated firing patterns carry on the computational process. The former is often called the "grandmother neuron" theory, after Horace Barlow's semifacetious proposal that somewhere in your brain there should be a neuron that fires precisely when you are looking at your grandmother. The latter theory is associated with Hebb, and also with the proposal that synchrony between neurons, or precise differences in firing times, should carry information. This issue has split the experimental and modeling communities. As long as only a small random subset of the neurons in a given area of the brain could be studied at once, there was little chance of making a definitive experimental test of this issue.


Enter the era of large datasets, collected with a density and depth previously unimaginable: multineuron image sequences, multineuron spike sequences, and spike-triggered statistical analyses. Simultaneously recording 1,000–10,000 neurons in a column on millisecond time scales is not out of reach. Suddenly, the possibility of detecting cell assemblies with stereotyped firing patterns seems realistic.
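One of those spike-triggered analyses, the spike-triggered average, is simple enough to sketch. The Python fragment below is a toy illustration on synthetic data, not any laboratory's actual pipeline; the "receptive field" construction and all parameters are invented for the demonstration. It averages the stimulus segments preceding each spike to estimate what drives a neuron:

```python
import numpy as np

def spike_triggered_average(stimulus, spike_times, window):
    """Average the stimulus segments preceding each spike.

    stimulus    : 1-D array, one stimulus sample per time bin
    spike_times : indices (time bins) at which spikes occurred
    window      : number of bins of stimulus history to average
    """
    segments = [stimulus[t - window:t]
                for t in spike_times if t >= window]
    return np.mean(segments, axis=0)

# Synthetic demo: a neuron that tends to fire after a positive stimulus bump.
rng = np.random.default_rng(0)
stim = rng.standard_normal(10_000)
# Fire whenever the recent stimulus average is high (a toy "receptive field").
drive = np.convolve(stim, np.ones(20) / 20, mode="same")
spikes = np.where(drive > 0.5)[0]

sta = spike_triggered_average(stim, spikes, window=50)
# The STA rises toward the spike time, recovering the toy filter's shape.
```

Averaged over enough spikes, the random parts of the stimulus cancel and the feature the neuron responds to stands out; the same idea, elaborated considerably, underlies receptive-field estimation from real recordings.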


© 2005 Nature Publishing Group

Single-cell calcium imaging in vivo, as shown here in the rat cortex, might elucidate the relationships between anatomy and physiology of neuronal populations at an unprecedented level.

The Reid lab video is just one development. Other new methods measure the activity of multiple neurons with submillisecond time resolution. At the MSRI/RNI workshop, Charles Gray of Montana State University described recent progress in developing multielectrode arrays, which allow the experimenter to observe spike trains from many neurons simultaneously. The recordings from these arrays, configured as more than 50 tetrodes in a column, can be disentangled, after sophisticated signal processing, into dozens of spike trains associated with individual neurons near the array. The spike trains, in turn, exhibit correlations with one another and with the visual stimuli. So researchers are now able to witness the "passing of information" among neuronal collectives.
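Correlations between simultaneously recorded spike trains are commonly quantified with a cross-correlogram: a histogram of the time differences between spikes in two trains, whose peak reveals a consistent lag between the neurons. A minimal sketch, using synthetic spike times with an artificial 5-ms delay (the delay, jitter, and counts are all illustrative):

```python
import numpy as np

def cross_correlogram(spikes_a, spikes_b, max_lag, bin_size=1):
    """Histogram of time differences (b - a) between two spike trains."""
    diffs = []
    for t in spikes_a:
        near = spikes_b[(spikes_b >= t - max_lag) & (spikes_b <= t + max_lag)]
        diffs.extend(near - t)
    bins = np.arange(-max_lag, max_lag + bin_size, bin_size)
    counts, edges = np.histogram(diffs, bins=bins)
    return counts, edges

# Synthetic demo: neuron B tends to fire about 5 ms after neuron A.
rng = np.random.default_rng(1)
a = np.sort(rng.choice(100_000, size=2_000, replace=False))  # spike times (ms)
b = a + 5 + rng.integers(-1, 2, size=a.size)                 # jittered 5-ms lag

counts, edges = cross_correlogram(a, b, max_lag=20)
peak_lag = edges[np.argmax(counts)]
# peak_lag sits near +5 ms, exposing the A-to-B delay.
```

On real data the same histogram, computed against a shuffled control, is one standard way of asking whether two neurons interact or merely share a common input.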

Another exciting development has been the use of 100-electrode array implants in paralyzed human subjects. John Donoghue's group at Brown University and at Cyberkinetics in Foxborough, Mass., has been using this implant as a neural prosthesis, enabling subjects to command a computer merely by thinking about the desired effect.5 Mathematical techniques, such as the Kalman filter, are used to interpret the recorded spike trains, and Donoghue's collaborators, including mathematicians and computer scientists, continue to refine these algorithms. The datasets from all these high-bandwidth recordings need sophisticated analyses to tease out their meanings, and this is one important role for statisticians and mathematicians.
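The Kalman filter at the heart of such decoders is itself a short algorithm. The sketch below is a textbook scalar version run on synthetic data, not Donoghue's actual decoder; the random-walk state model and the noise variances are purely illustrative stand-ins for an intended movement observed through noisy firing rates:

```python
import numpy as np

def kalman_1d(observations, q=0.01, r=1.0):
    """Scalar Kalman filter: track a random-walk state from noisy readings.

    state model      : x[t] = x[t-1] + process noise (variance q)
    observation model: z[t] = x[t]   + sensor  noise (variance r)
    """
    x, p = 0.0, 1.0          # state estimate and its variance
    estimates = []
    for z in observations:
        p = p + q            # predict: uncertainty grows
        k = p / (p + r)      # Kalman gain
        x = x + k * (z - x)  # update with the innovation
        p = (1 - k) * p
        estimates.append(x)
    return np.array(estimates)

# Synthetic demo: a slowly drifting "intended velocity" seen through noise.
rng = np.random.default_rng(2)
true_v = np.cumsum(rng.normal(0, 0.1, size=500))   # random-walk state
firing = true_v + rng.normal(0, 1.0, size=500)     # noisy observations

decoded = kalman_1d(firing, q=0.01, r=1.0)
# The filtered estimate tracks true_v far better than the raw observations.
```

The gain k balances the model's prediction against each new measurement; real prosthesis decoders use multidimensional versions of the same recursion, with the observation model fit to each neuron's tuning.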

Moreover, the ongoing data revolution is causing a theory revolution: an increasingly sophisticated approach to learning models from real data. Instead of proposing simplistic equations to model the process of thought, theorists now use massive existing databases, which satisfy no known equations, as sophisticated models of what the brain deals with and how it processes information. At the MSRI/RNI workshop, mathematicians discussed several such new data-driven models. Extrapolating only slightly, Carnegie Mellon University's Michael Lewicki showed how the hierarchical structures in visual patterns could be learned automatically from visual data by statistical learning algorithms.6 Going further, Stuart Geman of Brown University presented a model in which partial synchrony of neural firing could create grammatical groupings, as in the syntax of language. At what one might call the high end of theorizing, Jeff Hawkins, the director of the RNI, presented a general framework for the computations of the brain, entitled "How the Cortex Works."7 It may soon be possible, even for models as ambitious as Hawkins', to devise experimental tests of their key ideas.


Awareness of the opportunities of high-throughput data is now driving both new theory and new experiment. This gives theory and experiment new common ground; the way is clear for fruitful exchanges that erase the past history of division and indifference. We should now vigorously encourage such exchanges. But this requires effort.

Funding agencies such as the National Institutes of Health and the National Science Foundation can encourage such interactions by:

• Requiring interdisciplinary data-analysis teams, with statisticians, computer scientists, electrical engineers, and applied mathematicians explicitly involved in the data analysis of any sponsored work, just as statisticians are currently required in the design and analysis of clinical trials

• Explicitly supporting theoretical modeling efforts related to new high-throughput data sources

• Encouraging experimentalists to open their labs to cross-disciplinary work

• Creating publicly available data sources, similar to the Human Genome Project for genomics, the Sloan Digital Sky Survey for astronomy, and the Digital Human project for anatomy

• Creating alternatives to the study-section/continuing-grant model, which discourages out-of-the-box, entrepreneurial thinking

Other organizations, particularly scientific societies and publicly funded research institutes, can further the exchange of ideas by bringing members of different communities together. MSRI itself has long been at the forefront, holding workshops that bring mathematicians together with scientists in other fields to facilitate these interactions. Other organizations in mathematics and science should follow its lead and foster further exchange.

The common denominator here is the disruptive influence of new massive datasets, with detail and density never before available. The data are becoming rich enough and complicated enough for mathematical sophistication to become essential. Funding agencies and research institutes should push hard to promote sophisticated methods, open-source datasets, and close collaboration.

Maybe we'll soon see things even Sherrington never imagined.
