Onur Güntürkün left his native Turkey decades before he learned of whistled Turkish—a means of communication practiced in a remote region of the country. Güntürkün, a cognitive neuroscientist at Ruhr University Bochum in Germany, was on sabbatical in Australia a few years ago when a colleague there mentioned that he had visited the villages where whistling is used to converse over long distances. “From the first moment, it was clear to me this was what I needed to conduct critical experiments,” he says.
Güntürkün studies asymmetries in the brain, and the dogma in his field, he explains, has been that the left hemisphere is dominant in processing language. The right hemisphere plays a smaller role, primarily in interpreting prosodic aspects of speech such as pitch, rhythm, and intonation.
“We assumed that the hemispheric asymmetry that we have in language processing is crafted in such a way that the left hemisphere doesn’t care about the physical structure of language,” he says. So whether it’s a tonal language that relies on pitch, one that employs clicks, or sign language, the brain still knows it’s language and the left hemisphere carries the processing load.
Whistled language, however, is something altogether different, Güntürkün says. The sounds of whistled Turkish recapitulate spoken syllables as closely as possible, but cannot duplicate every aspect of speech; the slower, more melody-like changes in acoustic signal characteristic of the whistled language are just what the right hemisphere is primed to process. “Therefore, whistled language is a perfect experiment of nature. It’s a full-blown language but delivered with a physical structure for which the right hemisphere is dominant.” Güntürkün wanted to see whether the asymmetric processing of language would hold up among whistled-Turkish speakers, so he and his wife headed for the hills of Kusköy in the northeast of the country.
They made friends with locals who helped recruit whistled-Turkish speakers for the study. Because the Güntürküns visited during Ramadan, the cafés in town made for quiet daytime places to conduct experiments. The setup they used is a tried-and-true method of assessing lateralization: through headphones, each ear receives a different sound simultaneously (for example, bah in one ear and tah in the other), and the listener is asked to say which one he perceived. If the sound perceived came through the right ear, that would mean the left hemisphere dominated the sound processing, and vice versa.
When study participants were played syllables of spoken Turkish, the brain lateralization was as expected: most of the time, the volunteers perceived the sound that came into the right ear. But when they heard bits of whistled Turkish, the asymmetry disappeared—neither hemisphere seemed to dominate (Current Biology, 25:R706–08, 2015). “There is a certain physical form of language, in this case whistles, where the dominance of the left hemisphere is broken,” Güntürkün says.
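For a concrete sense of how responses in such a dichotic-listening test are typically scored, here is a minimal sketch in Python of the standard laterality index, (R − L)/(R + L), where R and L are the numbers of trials on which the right-ear or left-ear syllable was reported. The function name and the tallies below are invented for illustration and are not taken from the Current Biology study.

```python
# Minimal sketch of scoring a dichotic-listening test.
# A positive laterality index indicates a right-ear (left-hemisphere)
# advantage; values near zero indicate no clear asymmetry.

def laterality_index(right_ear_reports: int, left_ear_reports: int) -> float:
    """(R - L) / (R + L) over the trials on which one ear's syllable was reported."""
    total = right_ear_reports + left_ear_reports
    if total == 0:
        raise ValueError("no scorable trials")
    return (right_ear_reports - left_ear_reports) / total

# Invented tallies mimicking the reported pattern: a clear asymmetry for
# spoken syllables, and roughly none for whistled ones.
print(laterality_index(right_ear_reports=62, left_ear_reports=38))  # 0.24
print(laterality_index(right_ear_reports=51, left_ear_reports=49))  # 0.02
```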
Martin Meyer, who studies speech processing at ETH Zürich, cautions against overinterpreting the results. For one, he asks, are short notes of whistled Turkish actually perceived as syllables in the same way one would discern discrete bits of speech? “This is not mentioned in the paper, whether there is a tight correspondence between syllables in spoken language and syllables in whistled language.” If not, then the symmetry observed is just what one would expect if a person were simply listening to whistled notes, rather than to language.
Julien Meyer (no relation to Martin), a researcher at CNRS in France who has spent years documenting whistled languages, says his own work shows that such languages (there are quite a few in addition to Turkish) do indeed reflect the phonology—the basic sound elements—of spoken language. “Some people try to say it’s a simpler language,” Meyer says. But the phonological references “you have in your brain . . . these are the same references.”
However, Julien Meyer says he’d like to temper the idea that whistled languages are perceived purely as slow modulations of sound. In a study of whistled Turkish, he found that speakers pick up on formants, the characteristic acoustic resonances that the vocal tract imparts to speech sounds. Formants, he says, are rapidly changing parts of the speech signal that the brain analyzes. “Whistled consonant-vowel transitions are as quick as spoken ones.”
One way to better assess what participants perceive, suggests Martin Meyer, is to play them longer segments of speech, say, phrases, and record brain activity with electroencephalography (EEG) or fMRI. Güntürkün says he had to make do with a low-tech approach given Kusköy’s rugged terrain and remoteness, geographical features typical of the regions where whistled languages are used to communicate over long distances. In his book, Whistled Languages, Julien Meyer lists those used around the world: from Kickapoo and Siberian Yupik in North America to Punan Busang in Malaysia to a handful of languages in Brazil and two dozen whistled languages in Africa.
Not surprisingly, whistled languages are threatened by the ubiquity of cell phones. Güntürkün says the number of whistled-Turkish speakers has dropped considerably in the last few decades, and just about no women use it anymore. Martin Meyer says such “exotic languages” are valuable tools in learning more about how the human brain processes language, a field of study that has focused on English, German, and other common languages. “The diversity of language worldwide is really massive,” he says. “But there’s no comparable diversity of brains. The hardware is more or less the same. So the brains of humans all over the world are able to master all those diverse languages. [How they do] is one of the really interesting questions.”