Wandering through a maze with striped gray walls, a mouse searches for turns that will take it to a thirst-quenching reward. Although the maze seems real to the mouse, it is, in fact, a virtual world. Virtual reality (VR) has become a valuable tool to study brains and behaviors because researchers can precisely control sensory cues, correlating nerve-cell activity with specific actions. “It allows experiments that are not possible using real-world approaches,” neurobiologist Christopher Harvey of Harvard Medical School and colleagues wrote in 2016 in a commentary in Nature (533:324–25).
Studies of navigation are perfect examples. Extraneous sounds, smells, tastes, and textures, along with internal information about balance and spatial orientation, combine with visual cues to help a mouse move through a maze. In a virtual environment, researchers can add or remove any of these sensory inputs to see how each affects nerve-cell firing and the neural patterns that underlie exploration and other behaviors.
But there’s a catch. Many VR setups severely restrict how animals move, which can change nerve cells’ responses to sensory cues. As a result, some researchers have begun to build experimental setups that allow animals to move more freely in their virtual environments, while others have started using robots to aid animals in navigation or to simulate interactions with others of their kind. Here, The Scientist explores recent efforts in both arenas, which aim to develop a more realistic sense of how the brain interprets reality.
FLY ON A TETHER
Researcher: Mark Frye, neurobiologist, University of California, Los Angeles
VR Setup: Magnetic tether
Vertical bars, small “boxes,” and landscapes of moving vertical lines may seem trivial, but in a fly’s world they represent aspects of the landscape such as trees (bars), predators (box), and being blown off course (lines). “We are interested in understanding how visual systems distinguish these sorts of features,” says Frye. “Our own brain does the same sorts of things, but we don’t have a clear understanding of how, on the molecular and single-cell level.”
Frye and colleagues developed a tether system that lets flies take flight in a virtual environment. The researchers glue a small steel pin to a fly’s dorsal thorax (just behind the head, between the two wings) and suspend the pin and attached fly in a magnetic field, so the insect can rotate freely about its vertical axis. They then let the fly steer within an arena ringed by display panels.
What it takes: A few inexpensive rare-earth magnets ($10 each; see list at the Frye lab website), a miniature V-shaped pivot bearing, and a steel pin. You also need a video camera and computer to track the fly’s body angle, plus LED panels to generate the visual stimuli. “Flies can see faster than humans, detecting the flickering of our standard computer monitors, so we have to use something faster to display movies to them,” Frye says. The LEDs come in small 8x8-pixel panels that connect like Legos ($30 each; total cost ~$1,500, IORodeo). The full panoramic display the fly sees is 96x32 pixels. That seems like very low resolution to us, but flies also have poor spatial resolution, so to them these displays look like high-definition television, Frye says.
What you can learn: Frye and a colleague recently used the magnetic tether to study flies’ saccades, very fast turns from one heading to another (Curr Biol, 27:2901-14.e2, 2017). Decades of work had shown that rigidly fixed flies track a projected bar with smooth steering movements. But the new setup showed the opposite: the flies followed the bar with sustained bouts of saccades and surprisingly little smooth movement. In contrast, the insects steered smoothly when shown a projection of a panoramic scene. “What blew my mind was the fact that the bar stimulus is not processed by the smooth panorama system at all,” Frye says. “The really interesting implication here is that rigidly fixing a fly in place in virtual reality somehow disrupts visual processing in a systematic way.”
RAT ON A BALL
Researcher: York Winter, cognitive neurobiologist, Humboldt University, Berlin
VR Setup: Virtual Reality ServoBall
The way rodents’ heads are fixed in common VR setups dramatically restricts how the animals act, so complex behaviors such as spatial orientation, which require head movement, are impossible to elicit, according to Winter. Such restriction is especially stressful for rats, and risky for handlers, because rats, unlike mice, are strong enough to hurt someone trying to restrain them. As an alternative to head-fixed VR treadmills, Winter and colleagues developed the Virtual Reality ServoBall (J Neurophysiol, 117:1736-48, 2017).
Rats walk from their home cage into the VR environment through a radio-frequency identification (RFID)–controlled gate system. Because the rats can enter the VR arena any time, training them is relatively quick and easy, even for cognitively complex VR experiments, the team notes.
What it takes: A home cage connected by a tunnel to the experimental arena containing the ServoBall, a spherical treadmill system ($94, Phenosys). The arena holds a 490-millimeter platform within a transparent cylinder that confines the animal to the central part of a 600-millimeter-diameter ball. The treadmill is surrounded by a circle of monitors that display the visual environment from the animal’s position in the VR scene. Video cameras track the animal’s movement, providing feedback in a closed loop that can alter the movement of the ball to keep the animal in the center of the arena. Eight retractable liquid-reward devices at the periphery deliver water reinforcement at experimentally predetermined locations.
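The closed loop described above can be reduced to a simple control rule: measure the animal’s displacement from the arena center and drive the ball in the opposite direction. Below is a minimal Python sketch of that idea; the function name, gain, and coordinate conventions are hypothetical and not taken from the published system.

```python
# Minimal sketch of a closed-loop re-centering rule for a motor-driven
# spherical treadmill. All names and the gain value are illustrative.

def servo_ball_update(animal_xy, gain=1.5):
    """Given the animal's (x, y) displacement from the arena center
    (in mm), return a ball-velocity command that carries it back."""
    x, y = animal_xy
    # Drive the ball opposite to the displacement; the animal walking
    # in place on the ball is thus returned toward (0, 0).
    return (-gain * x, -gain * y)

# Example: the camera system reports a drift of 40 mm right, 20 mm forward.
vx, vy = servo_ball_update((40.0, 20.0))
```

A real implementation would run this at the camera frame rate and likely add damping so the ball does not overshoot, but the core idea is just this proportional correction.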
What you can learn: Because the ServoBall can stop and start to give the animal more autonomy in its exploration, the rat receives touch information from the physical walls of the arena as well as balance and other information when its body rotates. And, because no strength is needed to move the motor-driven ball, it can also be used with mice, certain species of lemurs or birds, and even insects, the team notes. The setup can also be used to study the neuronal activity underlying free exploration by combining the ServoBall with optogenetic techniques or microscope headpieces designed for freely moving animals.
INTO THE “CAVE”
Researcher: Anton Sirota, neuroscientist, Ludwig Maximilian University of Munich
VR Setup: ratCAVE-VR
Spherical treadmills are great for presenting precise stimuli to animals, says neuroscientist Andrew Straw at the University of Freiburg in Germany. The downside is that the animal receives sensory feedback that is unnatural. “This can be particularly problematic when studying spatial awareness and spatial cognition,” Straw says. “If the animal doesn’t feel like it is moving correctly, it may try to correct the situation rather than behaving as it would in more natural conditions.”
To move beyond this limitation, teams of scientists, including Straw’s, are developing Cave Automatic Virtual Environment (CAVE) setups, in which animals move freely within a cube. Initially developed for flies (J Exp Biol, 212:1120-30, 2009), the technology has since been adapted for fish, mice, and, most recently, rats (bioRxiv, doi:10.1101/161232, 2017).
In the ratCAVE-VR experiments, rats gain visual feedback in 3-D space, and, in turn, interact with and follow the virtual walls, explore virtual objects, and avoid virtual cliffs—much more naturalistic behaviors, note Sirota and colleagues in a paper describing the setup.
What it takes: The testing area is a rectangular arena similar to those used for regular open-field experiments. In this configuration, however, the arena is painted white and serves as a projection surface. Sirota used an array of 12 high-speed cameras ($2,499-$3,499 for the array, OptiTrack, NaturalPoint Inc.) to track, in 3-D space, the position of the rodent’s head, which was decorated with reflective dots. This tracking system enabled the team to update the rodent’s head position with very high resolution. To map the virtual environment onto the projection surface, the team used an algorithm identical to those described in other rodent VR setups; in this case, the projection was continuously updated according to the changing 3-D position of the rodent’s head.
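The core geometric step in such head-coupled projection is to find where each virtual 3-D point must be drawn on the physical surface so that it appears in the right place from the animal’s current viewpoint. A minimal sketch, assuming a flat vertical wall and made-up coordinates (the function and variable names are illustrative, not from the ratCAVE software):

```python
# Sketch of head-coupled projection: intersect the ray from the head
# through a virtual point with a physical wall plane at x = wall_x.
# Coordinates are in meters; all names are hypothetical.

def project_to_wall(head, point, wall_x):
    """Return the (y, z) position on the wall plane where `point`
    should be drawn so it looks correct from `head`."""
    hx, hy, hz = head
    px, py, pz = point
    t = (wall_x - hx) / (px - hx)        # ray parameter at the wall plane
    return (hy + t * (py - hy), hz + t * (pz - hz))

# The same virtual point lands at different wall positions as the
# tracked head moves, which is why the projection must update continuously:
a = project_to_wall((0.0, 0.0, 0.1), (2.0, 0.5, 0.3), wall_x=1.0)
b = project_to_wall((0.2, 0.0, 0.1), (2.0, 0.5, 0.3), wall_x=1.0)
```

A full system repeats this for every pixel (typically on the GPU, with the arena geometry calibrated in advance), but the per-point logic is this ray-plane intersection.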
What you can learn: In the arena, the virtual walls can be shifted to appear in a different location from the physical walls. At first, the animals are fooled by the VR stimulus; after experiencing both the shifted VR environment and a normal one, however, they are no longer tricked by the shift. Straw says the animals probably could feel the walls with their whiskers and so discern the mismatch. “I think this demonstrates how powerful physical cues are for knowing where you are,” he says.
FROM VIRTUAL REALITY TO ROBOTS
Researcher: Jean-Marc Fellous, psychologist, University of Arizona
Setup: Sphero robot
Even if animals are moving freely, the constraints of a virtual environment may have significant consequences for how the neural circuitry underlying spatial navigation works. As an alternative, Fellous and his colleagues are having rats interact with robots to track how the animals’ brain activity correlates with behavior. The team developed a braking algorithm that lets the researchers precisely control the robot’s direction and speed, and, as a result, use it to lead rats along the correct path in a complex maze with nine possible reward sites (J Neurosci Methods, 294:40-50, 2018).
What it takes: The robot, called Sphero 2.0 ($130, Sphero) is a small ball, which Fellous harnesses to a chariot-like contraption. Between the wheels of the chariot is a small tray carrying rat treats, which help animals learn to follow the robot. The braking algorithm is used to make the robot stop at precise locations and travel at an exact speed.
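The essence of such a braking rule is to hold a cruise speed far from the stop point and ramp the commanded speed down with the remaining distance so the robot halts on the mark. Here is a minimal Python sketch of that idea; the parameter values and the linear ramp are assumptions for illustration, not the published algorithm.

```python
# Minimal sketch of a distance-based braking rule for a leader robot.
# cruise speed, braking distance, and the linear ramp are hypothetical.

def commanded_speed(dist_to_target, cruise=0.5, brake_dist=0.3):
    """Return a speed command (m/s) given the remaining distance (m)
    to the desired stop location."""
    if dist_to_target >= brake_dist:
        return cruise                    # far away: hold the cruise speed
    # Inside the braking zone: ramp linearly to zero at the target,
    # clamping negative distances (overshoot) to a full stop.
    return cruise * max(dist_to_target, 0.0) / brake_dist
```

Called once per control cycle with the robot’s tracked position, a rule like this yields repeatable stops at each reward site once brake_dist is tuned to the robot’s dynamics.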
What you can learn: Fellous and his colleagues collected electrophysiological recordings from robot-guided rats comparable to those obtained in VR experiments. They showed that place-cell firing in the hippocampus is the same whether rats learn the maze on their own or are led by the robot, suggesting that researchers could use robots instead of VR to study the neural activity underlying spatial navigation.
Straw notes, however, that while robots are an exciting addition to the tool chest, they too have drawbacks. “Using a robot to lead animals around or to simulate other animals can be really important for certain experiments, but robots are bound by the laws of physics,” he says. “With virtual reality, experimental designs that make use of teleportation and other physically impossible feats become possible.” The techniques could complement each other well, he notes.