By investigating the science behind “seeing” with sound, researchers hope to help blind individuals independently navigate the world.
October 1, 2017
Daniel Kish, an expert echolocator, uses sound to see the world. After losing his eyesight to retinoblastoma at the age of one, he learned to navigate using the noise from his tongue clicks bouncing off nearby surfaces. Dubbed “Batman” for his abilities, Kish is able to independently bike down streets and hike through the wilderness with ease.
While some may perceive echolocation as an almost superhuman sense, it’s a surprisingly ubiquitous ability. Although the vast majority of people are unable to navigate using echolocation alone, even those without training can use this skill to sense their environments—for example, by hearing the difference between standing in a cathedral and a soundproof room. “We hear echoes all the time,” says Daniel Rowan, an audiologist at the University of Southampton in the U.K. “What blind people [like Daniel Kish] are doing is . . . putting together a range of different skills that we already have, more or less, and taking them to a level of expertise that you and I wouldn’t have.”
For people who lack sight, echolocation can be a valuable skill. Since this technique is particularly useful for detecting objects at eye level, it is typically used as an addition to—rather than a replacement for—canes and guide dogs, which are helpful for identifying things on the ground.
As people age, however, their hearing often worsens, which can impede their ability to use echoes to identify their surroundings. Rowan wants to tackle this issue by first trying to piece together what types of acoustic information people with normal hearing can use for echolocation.
To address this question, Rowan and colleagues recently conducted a series of experiments on both blind and sighted participants who were inexperienced in echolocation. The trial subjects wore headphones that played long sequences of clicks that Rowan calls “virtual objects.” His team created these in a soundproof room by placing items around an acoustic mannequin (a model of the human head and ears). The researchers played a sound near the mannequin; it would hit the object and bounce back to the model head, where it was recorded, simulating the reflected sound an echolocator might hear.
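As a rough illustration of the idea (not the team's actual recording pipeline), a “virtual object” can be approximated in a few lines of NumPy by mixing a synthetic click with a delayed, attenuated copy of itself. The sample rate, click shape, and reflectivity value below are all illustrative assumptions:

```python
import numpy as np

FS = 44_100           # assumed sample rate (Hz)
SPEED_OF_SOUND = 343  # m/s in air at ~20 °C

def make_click(duration_ms=3.0, freq_hz=3500.0):
    """A short, damped sinusoidal burst: a rough stand-in for a tongue click."""
    t = np.arange(int(FS * duration_ms / 1000)) / FS
    return np.sin(2 * np.pi * freq_hz * t) * np.exp(-t * 2000)

def virtual_object(click, distance_m, reflectivity=0.3):
    """Mix the direct click with a delayed, attenuated copy (the echo)."""
    delay_s = 2 * distance_m / SPEED_OF_SOUND  # sound makes a round trip
    delay_samples = int(round(delay_s * FS))
    out = np.zeros(len(click) + delay_samples)
    out[:len(click)] += click                                     # direct sound
    out[delay_samples:delay_samples + len(click)] += reflectivity * click  # echo
    return out

# An object two meters away produces an echo ~11.7 ms after the click.
stimulus = virtual_object(make_click(), distance_m=2.0)
```

Varying `distance_m` and `reflectivity` changes the echo's timing and loudness, the two cues a listener would have to pick up on.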
“The advantage of doing what they did is that you have good control over the information that people are getting—that’s important because we know relatively little about the acoustic cues that people may use,” says Lore Thaler, a psychologist at Durham University in the U.K. who was not involved in the work. “[However,] I think that an important step would be to try to get from this paradigm to a paradigm where one can use a similar [method] but participants make their own mouth clicks.”
Rowan’s team discovered that inexperienced, sighted listeners could detect objects up to four meters away. However, these individuals performed poorly on tasks where high frequencies were removed from the acoustic stimuli. The researchers found similar results in their small sample of five blind participants (Hearing Research, 350:205-16, 2017).
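Removing high frequencies from a stimulus amounts to low-pass filtering it. The sketch below shows a minimal pure-NumPy version using a windowed-sinc filter; the 4,000 Hz cutoff and filter length are assumptions for illustration, not the study's exact parameters:

```python
import numpy as np

FS = 44_100  # assumed sample rate (Hz)

def lowpass_fir(cutoff_hz, numtaps=101):
    """Windowed-sinc FIR low-pass kernel with unity gain at DC."""
    n = np.arange(numtaps) - (numtaps - 1) / 2
    h = np.sinc(2 * cutoff_hz / FS * n) * np.hamming(numtaps)
    return h / h.sum()

def remove_highs(signal, cutoff_hz=4_000.0):
    """Strip energy above cutoff_hz, mimicking a degraded stimulus."""
    return np.convolve(signal, lowpass_fir(cutoff_hz), mode="same")

# An impulse ("click") contains all frequencies equally; after filtering,
# only the band below the cutoff survives.
click = np.zeros(1024)
click[512] = 1.0
filtered = remove_highs(click)

spectrum = np.abs(np.fft.rfft(filtered))
freqs = np.fft.rfftfreq(len(filtered), d=1 / FS)
```

Inspecting `spectrum` against `freqs` shows the content above the cutoff strongly attenuated, which is the kind of manipulation that hurt listeners' performance.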
According to Thaler, her lab and others have found that people who use higher-frequency clicks tend to perform better on echolocation tasks (PLOS ONE, 11:e0154868, 2016; J Neurosci, 37:1614-27, 2017). “This is a correlation, but it fits with what they have found here, [which is] that there’s something within the higher frequency range that’s informative,” she says. “When we teach these mouth clicks, knowing that people who make brighter clicks tend to perform better, we emphasize that people make a click that’s relatively brief and bright.”
“[This study] is quite nice because it [highlights] the importance of high-frequency listening,” says Andrew Kolarik, a research associate studying echolocation at the University of Cambridge who was not involved in the work. “Unfortunately, this seems to be the frequency that’s the first to go as people get older and they start to lose their hearing.”
Kolarik adds that the finding that individuals can use echoes to identify objects up to four meters away “makes us think that [this] is the possible level to aspire to, and that maybe we can train people or change the environment to try to make it easier to use echolocation.” He points out that previous experiments, including his own, have found that effective echolocation is only possible at much shorter distances—around two meters at most (PLOS ONE, 12:e0175750, 2017).
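For context, these distances correspond to very short round-trip delays. Assuming sound travels at roughly 343 m/s, the arithmetic looks like this:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def echo_delay_ms(distance_m):
    """Round-trip travel time for sound reflecting off an object."""
    return 2 * distance_m / SPEED_OF_SOUND * 1000

print(round(echo_delay_ms(2.0), 1))  # ~11.7 ms at two meters
print(round(echo_delay_ms(4.0), 1))  # ~23.3 ms at four meters
```

Doubling the detection range from two to four meters means resolving an echo that arrives only about twelve milliseconds later and is considerably fainter.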
More studies are necessary to tease apart what types of useful information high frequencies provide. However, researchers are now starting to think about ways to help blind individuals with compromised hearing.
According to Rowan, one potential approach will be to develop technologies that convert those high frequencies to lower ones. He adds that most conventional hearing aids are unlikely to fulfill this need because they don’t usually receive frequencies beyond the 3,000 Hz typically required for echolocation. Even hearing aids with extended bandwidth, between 8,000 and 10,000 Hz, may be limited, as recent evidence (PLOS Comp Biol, 13:e1005670, 2017) suggests there is useful information for echolocators beyond those frequencies as well.
“We need an improvement in technology in order to give blind people access to the information that’s above the current reach of hearing aids,” Rowan says. “We all lose our high-frequency hearing as we get older. So that’s a problem that health-care services need to tackle for blind people.”