ABOVE: A fruit fly tracked using the LEAP tool
MODIFIED FROM MURTHY AND SHAEVITZ LABS, PRINCETON UNIVERSITY

It takes an average of 17 minutes for a fruit fly couple to move from meeting to mating, says Talmo Pereira, a PhD student studying neuroscience in Joshua Shaevitz’s and Mala Murthy’s labs at Princeton University. The encounter is marked by “lots of complex stages, arguably more complex than human courtship,” he says. A male and a female Drosophila melanogaster first size each other up through an exchange of pheromones. If they’re compatible, the male chases the female down and woos her by “singing” with a wing that he vibrates in particular patterns to form the notes of his ballad. Then the partners dance, running and circling each other. Finally, the male attempts to copulate and the female accepts or rejects.

Pereira is studying how the courtship song and dance are...

Traditionally, researchers have collected data on animal movements by clicking through video footage, frame by frame, and labeling the body parts of interest. It’s a laborious process that can take graduate students or volunteers hours upon hours, says Ben de Bivort, a behavioral neuroscientist at Harvard who was not part of the study. And such herculean efforts produce small datasets compared to other methods that researchers employ in their studies of animal biology, such as genomics or high-resolution neural recording, he says. “So measurement of the behavior was always kind of a bottleneck.”

We developed all this crazy artificial intelligence just to try to understand fly sex.

—Talmo Pereira, Princeton University

Another option is to glue markers onto an animal’s limbs and then use computer software to track them from video footage. “Imagine like the markers you would put on Andy Serkis to make Gollum in Lord of the Rings,” says Gordon Berman, a theoretical biophysicist at Emory University who did a postdoc with Shaevitz but was not involved in the LEAP project. Unfortunately, animals are pretty good at grooming them off, and “putting these markers on a fly . . . is rather difficult,” he says.

Pereira says that watching actors in motion capture suits was in fact what got him thinking about how to track the flies. But as he dug into the literature, he realized that scientists were already starting to capture animal motion without using markers. Diego Aldarondo, then a Princeton undergraduate and a coauthor on the study, was meanwhile working on motion capture algorithms in a computer science class, and, after chatting about it with Pereira, he decided to apply neural networks to his lab’s footage of individual fruit flies as a course project.

In the first attempt, Aldarondo and another student in the course labeled thousands of frames of video with points denoting fly body part locations and then used those frames to train the network to recognize the body parts automatically. After the course ended, he and Pereira continued working on the project, tweaking the algorithm to automate more of the process. At the end of last year, they published a version of the tool that needs far fewer frames—around 100—to achieve up to 95 percent accuracy in tracking 32 points on a fly’s body. In their report, the researchers used LEAP to track all six of a fly’s legs, plus its wings, body, and head. They also applied their tool to capture the limb movements of a mouse (Nat Methods 16:117–25, 2019).
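The underlying approach, described in the Nat Methods paper, is a convolutional network that outputs a “confidence map” for each tracked point; the brightest pixel in each map is read off as that body part’s location. Below is a minimal Python sketch of just that readout step. The function name, array shapes, and random stand-in maps are illustrative assumptions, not LEAP’s actual API.

```python
import numpy as np

def peaks_from_confidence_maps(cmaps):
    """Read body-part pixel coordinates out of per-part confidence maps.

    cmaps: array of shape (n_parts, height, width), one map per tracked
    point; the brightest pixel in each map is taken as that part's
    location. Returns (n_parts, 2) coordinates as (x, y) plus peak values.
    """
    n_parts, h, w = cmaps.shape
    flat = cmaps.reshape(n_parts, -1)
    idx = flat.argmax(axis=1)              # brightest pixel in each map
    ys, xs = np.unravel_index(idx, (h, w))
    return np.stack([xs, ys], axis=1), flat.max(axis=1)

# Toy usage: 32 tracked points, as in the fly experiments; random maps
# stand in for real network output here.
coords, scores = peaks_from_confidence_maps(np.random.rand(32, 192, 192))
print(coords.shape)  # (32, 2)
```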

LEAP’s success comes from a combination of human and artificial input. After receiving a set of labeled video frames, it uses them to learn how points are placed according to each image’s features, and then spits out labels for the next set of frames, which a researcher then reviews. The tool’s guesses may not be great the first time, but correcting the program helps it get smarter. After a few rounds of back and forth between LEAP and a human, often taking less than a day in total, the program has learned enough to correctly identify the parts. De Bivort describes the process as “using the algorithm to produce the data to make a better algorithm.”
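As a rough illustration of that back-and-forth, here is a minimal Python sketch. PoseModel and LabelTool are hypothetical stand-ins for a pose network and a labeling interface, not LEAP’s real classes; only the loop structure, in which each round’s human corrections become the next round’s training data, follows the process described above.

```python
# Hypothetical stand-ins; LEAP's real interfaces differ.
class PoseModel:
    def __init__(self):
        self.training_set = []

    def fit(self, labeled):
        self.training_set = list(labeled)   # (re)train on all labels so far

    def predict(self, frames):
        return [{"head": (0.0, 0.0)} for _ in frames]  # placeholder guesses

class LabelTool:
    def annotate(self, frames):
        # A human labels these frames from scratch.
        return [(f, {"head": (10.0, 20.0)}) for f in frames]

    def review(self, frames, guesses):
        # A human accepts or corrects each proposed label.
        return list(zip(frames, guesses))

def human_in_the_loop(frames, model, tool, rounds=3, batch=100):
    labeled = tool.annotate(frames[:batch])  # small hand-labeled seed set
    for r in range(rounds):
        model.fit(labeled)                   # train on everything so far
        chunk = frames[(r + 1) * batch:(r + 2) * batch]
        labeled += tool.review(chunk, model.predict(chunk))  # fix the guesses
    model.fit(labeled)                       # final pass over all labels
    return model

frames = list(range(400))                    # dummy frame handles
trained = human_in_the_loop(frames, PoseModel(), LabelTool())
print(len(trained.training_set))             # 400
```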

“It’s surprisingly easy. It’s obviated a lot of my hard-won image processing skills over the years,” says Berman, who uses the tool in his own research on flies and prairie voles. “What used to take months and months of work now takes a couple of weeks, if that.”

Another artificially intelligent method for motion capture, DeepLabCut, developed by a separate group of Harvard researchers, appeared around the same time as LEAP and was also first applied to tracking mice and fruit flies. Each tool has its advantages: LEAP requires less time to train, while DeepLabCut uses a bigger neural network and performs better than LEAP on cluttered or lower-quality images, says Berman. But both have the strength of being applicable beyond the species they were first developed for. So far, multiple research groups have used LEAP to track the motion of mice, rats, grasshoppers, spiders, ants, fish, and more, says Pereira.

Both tools could have applications in everything from behavioral ecology to medical research, where they could help study disorders such as autism that are associated with stereotyped movements, says de Bivort. They’ll also help neuroscientists probe the connections between the brain and behavior, he adds. “Maybe the biggest question in neuroscience is: How does the brain produce behavior? Because that’s what the brain is for,” de Bivort says. “It’s not exaggerating to say that these tools are a big deal for our field now.”

Clarification (May 3): The first line of this article has been updated to reflect the fact that Talmo Pereira is a student in Mala Murthy’s lab as well as Joshua Shaevitz’s lab. 
