The brains of deaf people reorganize not only to compensate for the loss of hearing, but also to process language from visual stimuli, namely sign language, according to a study published today (February 12) in Nature Communications. Despite this reorganization for interpreting visual language, however, language processing is still carried out in the same brain region.
“The new paper really dissected the difference between hand movements being a visual stimulus, and cognitive components of language,” said Alex Meredith, a neurobiologist at Virginia Commonwealth University, who was not involved in the study.
The brain devotes different areas to interpreting various sensory stimuli, such as visual or auditory. When one sense is lost, the brain compensates by adapting to other stimuli, explained study author Velia Cardin of University College London and Linköping University in Sweden. In deaf people, for example, “the part of the brain that before was doing audition adapts to be doing something else, which is vision and somatosensation,” she said. However, deaf humans “don’t just have sensory deprivation,” she added—they also have to learn to process a visual, rather than oral, language.
To untangle brain changes due to loss of auditory input from adaptations prompted by vision-based language, the researchers used functional MRI to look at brain activation in three groups of people: deaf people who communicate through sign language, deaf people who read lips but don’t understand sign language, and hearing people with no sign language experience. The researchers showed the three groups videos of sign language and videos that held no linguistic content. The signing videos were designed to allow Cardin’s team to pinpoint which areas had reorganized to process vision-based language, as these areas would only activate in deaf signers. In contrast, the language-free videos would allow the researchers to identify areas in deaf brains that had adapted to the loss of auditory input, as these brain areas would activate in both deaf groups, but not in the brains of hearing volunteers.
The researchers found differences in the activity of the superior temporal cortex (STC), an area of the brain that arches over and behind the ears. The right STC, which processes auditory stimuli, responded differently in deaf and hearing people as they watched the language-free videos, indicating that the loss of auditory sensation had prompted the brain to reorganize this area to respond to visual stimuli. In contrast, only deaf signers showed differences in brain activity in response to the sign language videos in the left side of the STC, which contains the primary auditory cortex and is known to process language. This suggested that despite the loss of hearing, these people were still processing language in this location. “The brain keeps this [language-processing] function, but uses a different type of information: visual,” Cardin said.
The findings support animal work by Meredith and his collaborator, neuroscientist Stephen Lomber at the University of Western Ontario, showing that brain areas attuned to pinpointing the location of auditory stimuli in hearing cats are used by deaf cats to localize visual inputs instead. The fact that brain areas might retain their basic function while accepting new types of sensory stimuli makes sense, said Meredith. “There’s more and more evidence [that] these areas we regard as primarily auditory or primarily visual have small feedback connections in normal people from other sense modalities,” he said.
It’s still unclear, however, how sensory deprivation and sign language experience each drive distinct neurological changes, but the research provides the first direct evidence that changes due to learning sign language are different from those “due to deafness per se,” said neuropsychologist Karen Dobkins at the University of California, San Diego, who was not involved in the research. “The punch line is that the brains of deaf signers are plastic.”