[Image courtesy of Ehud Ahissar]
While it’s generally easy for humans to describe the color of an object or the feel of a fabric, neurobiologists have struggled to articulate the psychophysical mechanics of perception. Movement is nearly always part of sensing the world around us—we scan a room with our eyes or step back from a painting to get a different perspective; we tilt our heads to sniff; we lean in for a better listen; we stroke our babies’ heads to learn their contours. A central question, however, has been whether these physical movements are influenced by what we perceive. In other words, does perception shape our information-seeking activity, or do we rely on predetermined data-gathering motions?
In vision, at least on a superficial level, it appears obvious that the movement of the eyes and the images they perceive can have an effect on one another. Michele Rucci, the director of the Active Perception Laboratory at Boston University, offers this example: when you walk into a room, you select the location you want to examine based on the visual information you’ve gathered—your perception altered your behavior, and in response, “this behavior is now instructing vision,” he says.
If [perception] starts anywhere, it starts with the motor movement and not with the sensory reception. —Ehud Ahissar, Weizmann Institute of Science
“For touch, the assumption is that the processes leading to perception are only sensory processes,” says Ehud Ahissar, a professor at the Weizmann Institute of Science in Rehovot, Israel. That is, according to the conventional wisdom, sensations of touch don’t feed back and influence the motor movements involved in gathering tactile information. But there isn’t much evidence to back up this assumption. Ahissar and his colleagues designed a simple experiment to test whether motor movements are influenced by what the body feels: they equipped humans with whisker-like appendages. The “whiskers” were attached to the fingertips of eight adults who were blindfolded and earplugged. Ahissar’s group asked the subjects to determine which of two nearby poles was closer.
The researchers found that all the participants chose the same strategy, says Ahissar: “They tried to keep their hands as coordinated as possible and use a temporal cue to judge [distance].” The subjects moved both hands at the same time, and the first hand whose whisker touched a pole signaled that that pole was the closer one. In repeated trials, the difference between the poles’ distances was gradually shrunk to measure each subject’s tactile acuity.
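The article doesn’t specify the exact adaptive rule the researchers used to shrink the gap, but procedures like this are commonly run as a psychophysical staircase. The sketch below is an illustrative 2-down/1-up staircase—the rule, parameters, and the toy observer are all assumptions for demonstration, not the study’s actual protocol:

```python
import random

def staircase_threshold(p_correct, start_gap=10.0, factor=0.8,
                        trials=300, seed=1):
    """Illustrative 2-down/1-up staircase: shrink the pole gap after two
    consecutive correct responses, widen it after any error. The gap the
    procedure converges on estimates the subject's acuity (the ~71%-correct
    point). Rule and parameters are assumptions, not the study's procedure."""
    rng = random.Random(seed)
    gap, streak, history = start_gap, 0, []
    for _ in range(trials):
        if rng.random() < p_correct(gap):   # simulated subject response
            streak += 1
            if streak == 2:                 # two correct in a row: harder
                gap, streak = gap * factor, 0
        else:                               # error: easier
            gap, streak = gap / factor, 0
        history.append(gap)
    return sum(history[-50:]) / 50          # average gap over final trials

# Hypothetical observer whose accuracy rises from chance (0.5) with the gap.
observer = lambda g: min(0.99, 0.5 + 0.05 * g)
acuity = staircase_threshold(observer)
```

A real experiment would replace the simulated observer with a live response on each trial; the staircase logic itself is unchanged.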
On a subsequent day, the subjects got better at judging the poles’ distances—not in how quickly they responded, but in their acuity. Whereas on day 1 they could accurately pick out the closer pole when the distances differed by about 8 cm, on day 2 they could detect a difference of only a little more than 3 cm. It turned out that the improvement in acuity was due to a change in behavior: the subjects had slowed their hand movements as the task became more difficult, lengthening the delay between contact with the closer and more distant poles and thus enabling them to perceive a smaller spatial difference. The experience of previous trials had allowed them to refine their strategy.
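The temporal cue behind that improvement can be put in back-of-the-envelope terms: if both hands sweep forward at the same constant speed, the delay between first and second contact is roughly the distance difference divided by hand speed, so slowing down stretches the cue for the same spatial offset. This is a simplified model for illustration, not the study’s analysis:

```python
def contact_delay(distance_diff_cm: float, hand_speed_cm_s: float) -> float:
    """Delay (s) between whisker contacts with the near and the far pole,
    assuming both hands move at the same constant speed (simplified model)."""
    return distance_diff_cm / hand_speed_cm_s

# Same 3 cm offset, two sweep speeds (speeds are illustrative): slowing the
# hands triples the temporal gap, making the small offset easier to perceive.
fast_sweep = contact_delay(3.0, 30.0)   # ~0.1 s between contacts
slow_sweep = contact_delay(3.0, 10.0)   # ~0.3 s between contacts
```

Under this model, the subjects’ slowdown effectively traded response time for temporal resolution—exactly the behavior-driven gain the study reports.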
“It does suggest that active movement is crucial in the way we integrate sensory information,” says Hillel Chiel, a professor at Case Western Reserve University, who was not part of the study. Chiel says he’s not surprised by the results, but is gratified to see them come out of a controlled experiment. “The reality is, any organism that depends on a sense almost always uses active movement to sense,” he says. Take, for instance, a bloodhound, which moves its nose—if not its whole body—around a scent, or an owl that turns its head to hear.
Ahissar says he can imagine ways in which the whisker-based sense, adopted so quickly by the humans in the experiments, might be helpful to people with visual impairments. Sensory substitution devices convert images to tactile sensations, allowing blind people to feel what others can see. But Ahissar says these devices have often relied on the assumption that moving the body would not enhance perception, and are designed so that the user just passively receives the information from the machine. If, instead, visually impaired people could engage their motor movements with the tactile stimulation, perhaps they could learn faster, and it would be a richer and more user-friendly experience.
To develop such a device would take some technical ingenuity (“it’s now a little bit of a fantasy,” says Ahissar) and an understanding that movement is just as important in perception as is sensation itself. The view that perception starts with a stimulus impinging upon a sensory organ “is totally wrong,” he says. “If it starts anywhere, it starts with the motor movement and not with the sensory reception.”