[Image courtesy of Gil Menda]
Gil Menda was bored. It was 2012, and his research on facial recognition in wasps was going nowhere. The Cornell University graduate student turned to his advisor, neurophysiologist Ron Hoy, as the professor was running out the door to teach a class. There were jumping spiders in the lab already, so Menda asked for permission to attempt the impossible: to tap into the central nervous system of an arachnid that was far more liable to depressurize and die than sit still for brain surgery. Hoy assented.
It was a problem that had vexed biologists for decades, says Paul Shamble, an arachnologist who was then a fellow Cornell graduate student. The jumping spider is unusual among arachnids, most of which have relatively poor eyesight.
By the time Hoy returned from teaching his class, Menda had succeeded. Using an ultrathin metal wire, he’d gently poked a hole in the spider’s cuticle small enough to self-heal. A glass-insulated tungsten extracellular electrode, implanted within range of six neurons in the spider’s brain, registered neural activity as voltage spikes.
Shamble then helped Menda design and 3-D print harnesses for the male and female spiders. Refrigeration and a drop of wax immobilized them for study. Now that jumping spider vision was no longer a hypothetical research topic, Cornell grad students outside the Hoy lab took note. Soon, James Golden, a computational neuroscientist, and Eyal Nitzany, a biological statistician, joined the team. The foursome collaborated on experimental design and obtained the first recordings from the visual processing centers of the spiders’ tiny brains (Current Biology, 24:2580-85, 2014).
Menda recalls one day of that project in particular. He’d come into the lab early to take out the spiders and start recording neuron activity while using the visual stimuli Nitzany had developed. He showed an immobilized spider potential mates and its natural prey, flies, on a screen.
“I saw that [the spider] concentrated on only one from the variety I was showing him,” he explains. It was the prey stimulus that held the spider’s attention. The longer the spider focused, the more spikes Menda recorded. “I was like, ‘Whoa! This is recognition of the object. This is really interesting.’” Immediately, he sent out a group text, imploring fellow grad students and Hoy to drop what they were doing and come to the lab. They did—and they stayed all day, watching the live recording, developing new stimuli to test, and building computer programs that could sort and analyze the incoming data right then and there.
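The readout Menda describes is essentially counting extracellular voltage spikes during a stimulus window. A minimal sketch of that idea, assuming a digitized voltage trace and a hand-picked detection threshold (the function name, threshold, and refractory window are illustrative assumptions, not the team's actual pipeline):

```python
import numpy as np

def count_spikes(trace, threshold, refractory_samples=30):
    """Count upward threshold crossings in an extracellular voltage trace,
    ignoring crossings that fall within a refractory window of the last one."""
    above = trace > threshold
    # A spike onset is a sample above threshold whose previous sample was not.
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    spikes = []
    for i in onsets:
        if not spikes or i - spikes[-1] > refractory_samples:
            spikes.append(i)
    return len(spikes)

# Toy trace: baseline noise with three injected "spikes" (illustrative only).
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 0.05, 3000)
for t in (500, 1500, 2500):
    trace[t:t + 5] += 1.0

print(count_spikes(trace, threshold=0.5))  # → 3
```

A longer fixation on the prey stimulus would simply yield a larger count over the same window, which is the pattern Menda saw on the live recording.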
“I’ve had a lot of graduate students in my time,” Hoy says. “It’s usually one student, one problem, and everyone is working at their own cubicle. But in this case, it was kind of like mission control!”
The Cornell team discovered that it’s not just that the jumping spider has eight incredible eyes—it’s that it uses them together. When Golden’s algorithms sorted the thousands of recorded neural spikes into individual classes by height and shape, the researchers saw that the neurons interacted differently depending on the stimuli the spider was shown. Jumping spiders, which are roughly the size of a pencil eraser, are able to process visual information gathered by any of their eight eyes, decide what action they need to take depending on what the stimulus is, and then alter their body position and behavior accordingly.
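The sorting step described above — grouping recorded spikes into putative classes by waveform height and shape — can be sketched as a small clustering problem. This is an illustrative stand-in with synthetic waveforms and a minimal k-means, not Golden's published algorithm:

```python
import numpy as np

def waveform_features(spikes):
    """Reduce each spike waveform to two shape features:
    peak height and width at half-maximum (in samples)."""
    heights = spikes.max(axis=1)
    widths = (spikes > (heights[:, None] / 2)).sum(axis=1)
    return np.column_stack([heights, widths])

def kmeans(X, k, iters=50):
    """Minimal k-means: iteratively assign points to the nearest centroid.
    Initial centroids are spread deterministically across the data order."""
    centroids = X[np.linspace(0, len(X) - 1, k).astype(int)].astype(float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centroids) ** 2).sum(axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels

# Synthetic data: two "units" with distinct heights and widths (illustrative only).
rng = np.random.default_rng(1)
t = np.arange(40)
unit_a = np.exp(-((t - 20) / 2.0) ** 2)        # tall, narrow waveform
unit_b = 0.5 * np.exp(-((t - 20) / 6.0) ** 2)  # short, broad waveform
spikes = np.vstack([unit_a + rng.normal(0, 0.02, (30, 40)),
                    unit_b + rng.normal(0, 0.02, (30, 40))])

labels = kmeans(waveform_features(spikes), k=2)
print(len(np.unique(labels)))  # → 2
```

Once spikes are separated into classes like this, differences in how the classes respond across stimuli become visible — the comparison the Cornell team used to see the eyes working together.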
Collaboration was key to the project’s success. “To really make a bigger impact, to see a problem from multiple perspectives, this kind of research is critical to arrive at novel insight and novel solutions,” says John Wen, a roboticist at Rensselaer Polytechnic Institute in Troy, New York, who was not involved in the study.
Meanwhile, the Cornell team’s ongoing work—using the same microelectrode to record neuron activity in other spiders, wasps, dragonflies, bumblebees, and monkeys—has important implications for Wen’s field. As technology becomes increasingly focused on high performance in small packages, the team’s research offers a visual processing solution on a precise nano- and microscale. Hoy says that for scientists studying dyslexia and autism, “to be able to monitor eye movements in miniature would be fantastic.”
The current commercial eye-tracker market is dominated by slim, Nintendo Wii–like tracking bars and Google Glass–style wearables. These devices typically max out at five or six fingertip-size cameras, can be intrusive, and lose data points as the subject’s head moves, which makes them difficult to use when studying children and young adults with developmental disorders (J Vis Exp, doi:10.3791/3675, 2012).
But studying the jumping spider’s tiny visual processing system could lend insights that pave the way for eight cameras and a communication network far smaller than a pinky. “You could put a hundred jumping spiders along the frame of your glasses,” Hoy says. “If you could turn those eyes onto what your eyes were doing, you would have a gangbusters eye tracker.”
Nitzany offers a more macro application: subway surveillance cameras. “You have thousands of people going through the subway, and you need to know which one you need to focus on,” Nitzany explains. “You don’t want to focus on everyone. You have to know how to integrate all those signals so you only focus on the most important one. This is very much the same task as the jumping spider [does].”
It’s all part of the charisma of the jumping spider, whose specialness Hoy finds impossible to overstate. “It’s like they’ve been cast from web heaven to earth, where they have to find their own food and find mates,” he says. “People had pretty much given up on recording from them, but with good luck, a good set of hands, and a great team, we managed to crack it.”