Language helps the human brain perceive obscured objects, according to a study published today (August 12) in the Proceedings of the National Academy of Sciences. While some scientists have argued that vision is independent of outside factors, such as sounds or the brain’s accumulated knowledge, the study indicates that language influences perception at its most basic level.
“What the data shows is that language comprehension does benefit visual processing, when the language you hear concords with what you're looking at or trying to see,” Lotte Meteyard, who researches language processing at the University of Reading in the U.K. but was not involved in the study, told The Scientist in an e-mail.
“I think [the study] makes a really important contribution to the field of visual perception and cognition in general,” said Michael Spivey, a cognitive scientist at the University of California, Merced, who also did not participate in the research.
Psychology professor Gary Lupyan of the University of Wisconsin-Madison and a colleague tested the effects of language on perception by either saying or not saying a word and showing study participants either an obscured image that matched the word, an obscured image that did not match the word, or no image at all. Images ranged from kangaroos to bananas to laundry baskets. The researchers then asked the participants if they had perceived any objects and, if so, ascertained what they had seen.
The researchers used a technique called continuous flash suppression, in which study participants wear glasses that present either an image or nothing to one eye while displaying a flashing tangle of curved lines to the other.
The participants were more likely to perceive an image if they had been given an accurate verbal cue first than if they had been given no cue or an incorrect one. With an accurate cue, they identified the object correctly around 50 percent and 80 percent of the time in two slightly different rounds of the experiment, while falsely identifying objects much less frequently, around 15 percent and 4 percent of the time, respectively. The researchers allowed the participants to view the images for longer in the second round, which likely explains the higher accuracy rate.
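Hit and false-alarm rates like those quoted above are conventionally combined into a single sensitivity score, d-prime, from signal detection theory. As a rough illustration only, here is a minimal sketch that computes d-prime from the article's rounded percentages; the figures are not the study's raw data, and the study's own analysis may differ.

```python
# Illustrative only: d-prime from the rounded percentages quoted in
# the article, not from the study's actual trial data.
from statistics import NormalDist


def dprime(hit_rate: float, false_alarm_rate: float) -> float:
    """Signal detection sensitivity: d' = Z(hit rate) - Z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(false_alarm_rate)


# Round 1: ~50% correct identifications vs. ~15% false identifications
print(f"round 1 d': {dprime(0.50, 0.15):.2f}")  # ≈ 1.04
# Round 2 (longer viewing time): ~80% vs. ~4%
print(f"round 2 d': {dprime(0.80, 0.04):.2f}")  # ≈ 2.59
```

The point of the measure is that a higher hit rate only indicates better perception if it is not bought with more false alarms; here both rounds show genuine sensitivity, with the second round markedly higher.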
Next, the researchers performed a similar experiment using squares and circles, as well as other forms lying on the continuum between the two shapes. Hearing “square” or “circle” beforehand helped the participants detect an image, but only when the shape was reasonably congruent with the word.
Previous studies have shown connections between vision and other sensory inputs, said Spivey, but Lupyan’s experiment stands out because it provides evidence that language boosts perception at one of the very earliest stages in the brain’s vision circuit.
Continuous flash suppression has previously been shown to prevent the brain from perceiving objects at all, while other methods of obscuring images appear to allow the brain to perceive them subconsciously. Therefore, previous studies on vision that have used other approaches have only shown that outside inputs help the brain bring objects into consciousness, but not necessarily that outside inputs can dictate whether or not an object is seen at all.
Lupyan confirmed that continuous flash suppression was restricting perception at one of its earliest stages by testing the participants’ ability to see afterimages—lingering optical illusions of objects that were shown to them and then removed. The participants reported fainter afterimages following trials that used continuous flash suppression to suppress the images. Since afterimages are thought to be generated through retinal and early visual processing, not through higher cognition, the results indicated that continuous flash suppression acts at an early stage of vision. By using continuous flash suppression, Lupyan has “done the best job yet of showing where the interaction is happening in perception,” Spivey said.
Moreover, Lupyan said that his work could help researchers discern whether people who speak different languages perceive the world differently. For instance, if two people spoke different languages that either did or did not have words for a certain color or texture, the person lacking language to describe the color or texture might be less likely to perceive it.
Spivey said that Lupyan’s findings are part of a larger trend in cognitive science. While researchers once studied brain processes as separate modules, they are beginning to realize that parts of the brain are interconnected. “There are findings like this all over the place, in motor processing, language, and memory,” he said. “More and more what the field is finding is that any cognitive or perceptual capacity you find interesting is probably richly connected with other ones.”
“The visual system—and perception in general—uses all the information it can get to make sense of the inputs,” said Lupyan. “Vision is not just about the photons hitting the eye.”
G. Lupyan, E. Ward, “Language can boost otherwise unseen objects into visual awareness,” PNAS, doi: 10.1073/pnas.1303312110, 2013.