Mackerel shoaling in silvery spheres, flocks of blackbirds billowing like dark clouds, and ant colonies carpeting forest floors—nature boasts some spectacular examples of individual animals coming together to form coordinated hordes. The question of how they accomplish such collective behavior has occupied biologists for decades. But although the majesty of swarms is clear for all to see, the mechanisms that explain how starlings coordinate their speed and direction, say, or how honeybees decide where to make a new hive are far too subtle to be detected by the naked eye. (See “Crowd Control,” The Scientist, July 2013.)
“We need to see the fine-scale trajectories of every individual in a group at the same time so we can know precisely where they’re moving with respect to everybody...
One option is to physically mark the animals and track their movements using video footage. But attaching labels is labor intensive for researchers and may disrupt the natural behaviors of the target animals. Tracking unmarked animals is a better bet, but existing automated image-tracking software has its limitations. When two individuals cross paths, for example, the software works out which is which by calculating the most likely identity based on their trajectories before the two animals overlapped. Sometimes it mixes up the animals, and the errors propagate through the rest of the video, which means researchers, usually grad students, have to spend hours painstakingly checking each crossing incident by eye.
Not anymore. Now, a team of researchers at the Cajal Institute in Madrid, led by Gonzalo de Polavieja, has launched idTracker—an image-tracking program that maintains the correct identities of hundreds of individuals in a video with almost 100 percent accuracy, regardless of how similar they look and how many times they cross each other’s paths.
“It’s something that a lot of people wanted to be able to do, but they were the first to come up with a method that actually works,” said Simon Garnier, who runs the Swarm Lab at the New Jersey Institute of Technology in Newark. “It will ease the tedium burden and make it easier for us to tease apart swarm intelligence.”
Princeton University’s Iain Couzin, who studies collective animal behavior, is even more impressed. “I was frankly stunned to see such a brilliant solution to this long-lasting problem,” he wrote in an e-mail to The Scientist. “Previous methods did not work. At all. This method works near-flawlessly.”
De Polavieja’s team didn’t originally set out to make a tracking system. Back in 2008 the group wanted to create software that could distinguish between identical-looking fish in video footage. After a couple of years, though, the recognition software worked so well that the researchers realized it was capable of recognizing many individuals over the course of a video. “That’s when we knew we could make a tracking system,” says de Polavieja, who had noticed while reading the literature on collective behavior that there was nothing out there like what they had in mind. De Polavieja and graduate student Alfonso Pérez-Escudero conceived the program, and Pérez-Escudero wrote the software.
After several years’ tweaking and polishing, de Polavieja’s team described idTracker in Nature Methods earlier this year (11:743-48, 2014). In that paper, the researchers explain how the program extracts a unique visual fingerprint for every individual—a signature that humans cannot see. Analyzing short segments of footage for each animal in isolation, the software compares differences in gray-scale intensity and distance between hundreds of pairs of pixels to generate a set of data points that is unique to that particular individual. That signature can then be recognized and tracked regardless of the animal’s position or posture.
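The pairwise-pixel idea described above can be sketched in a few lines of Python. This is a hypothetical illustration, not idTracker’s actual code: the function name, the number of sampled pairs, and the bin counts are all assumptions, and the real program combines several such statistics. The core notion — a 2-D histogram over pixel-pair distances and gray-scale intensity differences, which is independent of the animal’s position and posture — is what the paper describes.

```python
# Illustrative sketch of an idTracker-style visual "fingerprint".
# Assumed details: function name, pair count, and bin counts are
# invented for this example, not taken from the published software.
import numpy as np

def fingerprint(blob, n_pairs=5000, dist_bins=20, diff_bins=20, seed=0):
    """Build a 2-D histogram over (distance, intensity difference)
    for randomly sampled pixel pairs inside one animal's blob.

    blob: 2-D array of gray-scale intensities for a segmented animal
          (zero outside the animal).
    Returns a normalized histogram that serves as an identity signature.
    """
    rng = np.random.default_rng(seed)
    ys, xs = np.nonzero(blob)                   # pixels belonging to the animal
    idx = rng.integers(0, len(xs), size=(n_pairs, 2))
    a, b = idx[:, 0], idx[:, 1]
    dist = np.hypot(xs[a] - xs[b], ys[a] - ys[b])          # pair distances
    diff = np.abs(blob[ys[a], xs[a]].astype(float)
                  - blob[ys[b], xs[b]].astype(float))      # intensity differences
    hist, _, _ = np.histogram2d(dist, diff, bins=(dist_bins, diff_bins))
    return hist / hist.sum()        # normalize so signatures are comparable
```

Because the histogram only records relative distances and intensity contrasts between pixel pairs, it stays roughly the same however the animal is oriented in the frame — which is what makes it usable as a fingerprint.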
The fingerprints are used as references to identify individuals in each frame of video, so the correct identities are kept even when animals cross over and over again. “Even if the system makes some mistakes, they will not propagate because you’re constantly identifying each individual in each frame,” says de Polavieja. “If it’s incorrect in one frame, it can be corrected in the next frame.”
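The per-frame logic de Polavieja describes can be sketched as follows. The similarity score used here (histogram intersection) and the function name are illustrative choices, not the paper’s actual method; the point is that each frame is matched against the stored reference fingerprints independently, so a wrong assignment in one frame does not carry into the next.

```python
# Illustrative sketch of per-frame identification against reference
# fingerprints. Assumed details: histogram intersection as the similarity
# score and the greedy best-match rule are invented for this example.
import numpy as np

def identify(frame_fps, reference_fps):
    """Assign each blob in the current frame to the reference individual
    whose fingerprint it matches best.

    frame_fps:     list of 2-D normalized histograms, one per blob in the frame.
    reference_fps: list of 2-D normalized histograms, one per known individual.
    Returns a list of reference indices, one per blob.
    """
    ids = []
    for fp in frame_fps:
        # Histogram intersection: larger means more similar.
        scores = [np.minimum(fp, ref).sum() for ref in reference_fps]
        # The winner is decided fresh in every frame, so an error here
        # cannot propagate to later frames.
        ids.append(int(np.argmax(scores)))
    return ids
```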
Finally, the software stitches together the tracks of each individual to produce a multicolored map of the movements of every animal in the group. It places no limit on video length, so researchers can study collectives over long periods. And it reidentifies individuals when they’re put into different groups, meaning it should help reveal how individual differences contribute to collective behavior.
“What’s fantastic is the level of precision and accuracy you get,” says Garnier. Indeed, when de Polavieja and his colleagues tested their software on 23 videos of five different species—including mice, fruit flies, zebrafish, and ants—it achieved 99.7 percent accuracy on average. “Now, for the first time, there is no need for graduate students to go back and check footage frame by frame,” says de Polavieja.
Better still, idTracker is free to download for noncommercial purposes (www.idtracker.es) and, according to Andrew King, is “pretty easy to use.” It’s also open-source, meaning code-savvy researchers are free to alter it to suit their specific requirements.
Garnier plans to use the new software to study how ants organize themselves to locate food sources or new nest sites. King wants to apply it to explore how fish with different personalities or experiences can affect group dynamics. “I can now mix up shoals, and the system will subsequently reidentify individuals I’ve already been working with,” he says. “That was much harder to do before, so it’s going to be really useful for us.”
Couzin concurs: “The study of animal behavior has been stuck in traditional methodology for far too long and is stagnating as a result. This type of technology will revitalize this field.”