Cover Story

Using Sound to Get Around

The sight of a blind person snapping her fingers, making clicking sounds with her tongue, or stomping her feet might draw stares on a street or in a subway station, but it’s this type of behavior that is opening up a vibrant new area of research in psychology.

Some vision-impaired individuals are beginning to navigate using echolocation, the same technique used by bats and some marine mammals. They’re essentially learning about the objects in their environment from the echoes that bounce off of them.
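The arithmetic behind this is simple: an echo returns after the sound has traveled out to the object and back, so the object’s distance is the echo delay multiplied by the speed of sound, divided by two. Here is a minimal sketch of that relationship (illustrative only, not taken from any of the studies discussed here):

```python
# Basic echolocation geometry: an echo returns after sound travels out
# to the object and back, so distance = speed_of_sound * delay / 2.

SPEED_OF_SOUND = 343.0  # metres per second in air at about 20 degrees Celsius

def distance_from_echo_delay(delay_s: float) -> float:
    """Distance (m) to a reflecting surface, given the click-to-echo delay (s)."""
    return SPEED_OF_SOUND * delay_s / 2.0

# A wall about 2 m away returns its echo after roughly 11.7 ms:
print(round(distance_from_echo_delay(0.0117), 2))  # -> 2.01
```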

Echolocation is not only a fascinating subject in its own right but also offers a suitable paradigm for studying neuroplasticity from several disciplinary perspectives. It is a technique that people (not only blind, but also sighted) can learn relatively easily and that can be used to probe how the brain deals with novel sensory information.

A General Ability

At one time, echolocation in humans was referred to as “facial vision” or “obstacle sense.” In fact, the term “echolocation” was coined by zoologist Donald Griffin only in 1944. Initially, the ability to detect obstacles without vision was considered a special skill of a few blind people. Scientists weren’t clear on how it worked, that is, whether the ability to detect obstacles without vision was mediated by pressure waves on the skin or by sound. But a set of experiments conducted in the 1940s showed that sound and hearing were the driving factors.

Subsequent research showed that both blind and sighted people can develop the skill to avoid obstacles without vision, as long as they have normal hearing. In sum, these studies showed that the “obstacle sense” was not a mysterious skill that only some blind people possessed, but instead a general human ability.

Beyond Obstacle Detection

Initially, echolocation research focused mainly on the detection of obstacles. Yet subsequent studies progressed from using obstacle-detection tasks to measuring people’s ability to echolocate distance, direction, shape, material, motion, or size. Most studies made use of “categorical tasks,” which measured participants’ ability to identify something from a limited number of alternatives. In the 1960s, Winthrop Kellogg introduced the psychophysical method to human echolocation research, making more fine-grained measures of people’s echolocation abilities possible. Researchers have used psychophysical methods to measure people’s echolocation of location (direction and distance) and size.
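To make the contrast concrete, below is a minimal sketch of a generic adaptive staircase, the sort of psychophysical procedure used to estimate discrimination thresholds rather than simple hit rates. Everything in it is illustrative: `present_trial` is a hypothetical stand-in for playing click-plus-echo stimuli and recording the listener’s response, and the simulated observer and starting values are invented for demonstration.

```python
import random

def present_trial(delta_cm: float) -> bool:
    """Hypothetical stand-in for one trial: returns True if the listener
    judged correctly. Here, a fake observer whose accuracy grows with
    the size of the distance difference."""
    return random.random() < min(0.98, 0.55 + delta_cm / 50.0)

# 1-up/2-down staircase: two consecutive correct answers make the task
# harder, one error makes it easier; it converges near the 70.7%-correct
# threshold.
delta, step = 20.0, 2.0            # current difference (cm) and step size
streak, reversals, last_move = 0, 0, None
while reversals < 8:
    if present_trial(delta):
        streak += 1
        if streak == 2:            # harder: shrink the difference
            streak = 0
            if last_move == "up":
                reversals += 1
            delta, last_move = max(delta - step, 0.5), "down"
    else:                          # easier: grow the difference
        streak = 0
        if last_move == "down":
            reversals += 1
        delta, last_move = delta + step, "up"

print(f"estimated threshold: about {delta:.1f} cm")
```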

Scientists led by Bo Schenkman (Royal Institute of Technology, Stockholm, Sweden) as well as by Daniel Rowan (University of Southampton, United Kingdom) also have made progress in investigating the acoustic features that may be relevant for human echolocation. Yet this research has focused on echoes from longer white-noise signals rather than from the mouth clicks that people actually make. While most studies have focused on the echolocating person as a “perceiver,” it is important to keep in mind that echolocation is an active process. For example, in daily life people move their bodies and heads while they echolocate. Studies have shown that bats “steer their sound beam” to sample the environment during echolocation; similarly, recent investigations by our own group and by neurobiologist Ludwig Wallmeier (Ludwig Maximilian University of Munich, Germany) and colleagues have emphasized that movement can be an essential component in humans’ successful echolocation.

The Sonar Emission

In early research, the sounds (i.e., sonar emissions) that people made to generate echoes were not systematically controlled and thus included talking, humming, mouth clicks, footsteps, cane-tapping sounds, and other noises. But in a study published in 2009, researchers led by Juan Antonio Martínez Rojas (University of Alcalá, Spain) analyzed the physical properties of various sounds and concluded that mouth clicks might be particularly useful for human echolocation because they are highly reproducible (i.e., the sound is quite stable across repeated emissions). Additionally, the spatial relationship between mouth and ears is fixed (as compared with that between the ears and ambient sound, footsteps, cane taps, etc.). Because of these factors, people can interpret variations in audible sound as changes in the environment rather than changes in the emitted sounds themselves. The majority of recent investigations into human echolocation have examined mouth-click-based echolocation. Clicks tend to be 3- to 15-millisecond-long transients (i.e., short, high-amplitude sounds) with peak frequencies around 6 to 8 kilohertz.
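For readers who want to experiment, the sketch below synthesizes a click with roughly these properties. Only the duration and peak-frequency ranges come from the description above; the damped-sinusoid shape and the exact parameter values are assumptions made for illustration.

```python
import numpy as np

FS = 44_100            # sample rate (Hz)
DURATION = 0.005       # 5 ms, within the reported 3- to 15-ms range
PEAK_FREQ = 7_000.0    # Hz, within the reported 6- to 8-kHz range
DECAY = 0.001          # amplitude decay constant (s); an assumed value

t = np.arange(int(FS * DURATION)) / FS
click = np.exp(-t / DECAY) * np.sin(2 * np.pi * PEAK_FREQ * t)
click /= np.abs(click).max()  # normalise to unit peak amplitude
```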

Echo Suppression

The human auditory system typically shows a phenomenon termed echo suppression in which a person’s percept, upon hearing two sounds in rapid succession, is driven by the first of the two sounds. This phenomenon also is referred to as the precedence effect. Wallmeier and colleagues have suggested that echo suppression is reduced during echolocation compared with “regular” spatial hearing. The underlying mechanisms are unclear at present.
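A precedence-effect stimulus is easy to picture: two identical clicks a few milliseconds apart, the second one attenuated. At such short lags, listeners ordinarily report a single sound localized at the leading click. The sketch below builds one; the lag and attenuation values are assumptions for illustration, not parameters from the studies cited.

```python
import numpy as np

FS = 44_100
LAG_S = 0.003      # 3-ms lead-lag delay (assumed)
ECHO_GAIN = 0.5    # lagging sound attenuated relative to the lead (assumed)

def make_click(dur: float = 0.005, freq: float = 7_000.0) -> np.ndarray:
    """Damped-sinusoid click, built as in the earlier sketch."""
    t = np.arange(int(FS * dur)) / FS
    return np.exp(-t / 0.001) * np.sin(2 * np.pi * freq * t)

click = make_click()
lag = int(FS * LAG_S)
stimulus = np.zeros(lag + click.size)
stimulus[:click.size] += click                       # leading ("direct") sound
stimulus[lag:lag + click.size] += ECHO_GAIN * click  # lagging ("echo") sound
```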

The Neurobiology of Echolocation

To date, evidence for brain areas involved in human echolocation comes from studies using neuroimaging methods such as PET or fMRI. In 1999, neuroscientist Anne G. De Volder (Catholic University of Louvain, Belgium) and colleagues used PET to measure brain activity in blind and sighted people while they used an echolocation-based sensory-substitution device. The device — which included a pair of glasses equipped with an ultrasound speaker, two ultrasonic microphones, two earphones, and a processing unit — acquired and decoded ultrasonic echoes into audible sounds sent to the user’s earphones. The pitch of the audible sounds conveyed distance, and the sound’s binaural intensity balance conveyed direction. The researchers found that in the group of blind subjects, the processing of sound from the device was associated with an increase in brain activity in Brodmann area (BA) 17/18 (i.e., the early “visual” cortex). Though subjects in the study did not echolocate per se, this was the first evidence to suggest that information derived from echolocation may drive early-visual-cortex activity in blind people.
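The study specifies only the two mappings (pitch for distance, binaural intensity balance for direction), so the sketch below fills in the rest with assumed values: the frequency range, the distance limits, and the panning law are illustrative choices, not the device’s actual parameters.

```python
import numpy as np

FS = 44_100  # sample rate (Hz)

def sonify(distance_m: float, azimuth_deg: float, dur: float = 0.2) -> np.ndarray:
    """Return a stereo (N, 2) tone: nearer objects get a higher pitch, and
    lateral objects are louder in the corresponding ear."""
    # Assumed pitch mapping: 0.5 m -> 2 kHz down to 5 m -> 200 Hz, log-spaced.
    d = np.clip(distance_m, 0.5, 5.0)
    freq = 2000.0 * (200.0 / 2000.0) ** (np.log(d / 0.5) / np.log(5.0 / 0.5))
    t = np.arange(int(FS * dur)) / FS
    tone = np.sin(2 * np.pi * freq * t)
    # Assumed direction mapping: linear level panning across +/-90 degrees.
    pan = np.clip(azimuth_deg / 90.0, -1.0, 1.0)  # -1 = far left, +1 = far right
    left, right = (1.0 - pan) / 2.0, (1.0 + pan) / 2.0
    return np.column_stack([left * tone, right * tone])

signal = sonify(distance_m=1.5, azimuth_deg=30.0)  # object 1.5 m away, 30 deg right
```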

Encouraged by these findings, my colleagues and I in 2011 conducted the first-ever study to measure brain activity during echolocation in two blind people trained in echolocation using mouth clicks. Using fMRI, we found that while listening to echolocation sounds as compared with control sounds, both participants showed significant increases of brain activity in BA17. In this and subsequent studies, we also found that echo-related activity in BA17 is stronger for echoes coming from contralateral space (i.e., contralateral preference) than for other echoes, and that the activity pattern changes as the echoes move away from the center toward the periphery of space (i.e., modulation with eccentricity). A recent fMRI study by Wallmeier and colleagues has since confirmed the involvement of BA17 in echolocation in the blind. My colleagues and I also have found that echo motion (i.e., motion perceived via echolocation) activates brain areas that might coincide with the brain’s visual motion area MT+ and that the shape of echolocated surfaces might activate the lateral occipital complex (LOC), a brain area thought to be involved in the visual processing of shape. We also have found that both blind and sighted people show activation in the posterior parietal cortex during echolocation of path direction for walking, and the location of this activation might overlap with areas involved in the processing of vision for motor action.

In sum, although there are only a few studies to date about neural substrates of natural echolocation, it is increasingly evident that traditional “visual” brain areas are involved during echolocation in blind echolocation experts and that this activation appears to be feature specific.

Echolocation and Blindness

The literature to date suggests that blind people are more sensitive to acoustic reverberations even when they do not consciously echolocate. For example, researchers led by Chava Muchnik, a professor in the Department of Communication Disorders at Tel Aviv University, Israel, found that blind people have a better ability than sighted people to resolve two sounds occurring in rapid succession. That is, where a blind person is able to hear two sounds separated by a brief silent gap, a sighted person may hear only a single merged sound. This might be related to blind people’s improved processing of echoes. Research has shown that blind people typically outperform sighted people with respect to echolocation. Physiology researcher André Dufour (Université de Strasbourg, France) and colleagues asked both blind and sighted people to judge whether a sound-reflecting surface was located to the right or left and found that blind participants were more accurate than sighted participants. Yet there also are differences among people who are blind. For example, we have shown that blind people who have been trained in echolocation are better at determining the shape, size, or distance of objects, as compared with both blind and sighted people who have not been trained in echolocation. Furthermore, researchers led by psychological scientist Santani Teng (Massachusetts Institute of Technology) found that echolocation ability was related to the age at which people lost their vision: those who lost vision earlier in life tended to be better at echolocation than those who lost their vision at older ages. In terms of brain activations, we have shown that people who are blind and trained in echolocation may recruit visual cortical areas for the processing of echoes.

On the behavioral level, researchers have shown that echolocation may “substitute” for vision. Psychological scientist Gavin Buckingham of Heriot-Watt University, Scotland, and his colleagues at the Brain and Mind Institute at Western University, Canada, have shown that echolocation may induce errors in blind people’s judgments of object weight. Specifically, their 2014 study in Psychological Science showed that blind people trained in echolocation (but not blind people untrained in echolocation) experienced a “size–weight illusion” when they used echolocation to get a sense of how big objects were and then judged their weight.

Remarkably, sighted people experienced the illusion in the same way when they used their normal vision. In addition, researchers at the RBCS/Visuo-Haptic Perception Lab at the Istituto Italiano di Tecnologia, Italy, along with psychological scientist Melvyn A. Goodale (University of Western Ontario, Canada), showed that when asked to judge the relative locations of two sounds (using a spatial-bisection task), blind people who are not trained in echolocation show a deficit compared with sighted people. In contrast, blind echolocators perform equivalently to sighted people. This pattern shows that echolocation may replace vision for calibration of external auditory space in people who are blind.

These results, in combination with findings from brain imaging, suggest that echolocation may slot into the human perceptual system (in particular, the blind human brain) in a way similar to vision.

Outlook

Research into human echolocation is gaining momentum. Some recent studies conducted in the lab of Ludwig Maximilian University of Munich neurobiology professor Lutz Wiegrebe have investigated echolocation in virtual echo-acoustic space. Being in these virtual environments is similar to being in a 3-D video game except that the virtual space is echo-acoustic rather than visual. Using virtual echo-acoustic space allows researchers to investigate echolocation with greater flexibility than is available in the real world, but at the potential cost of reduced realism or processing-imposed time delays. Nonetheless, virtual acoustic space is a promising technique for the exploration of human echolocation, in particular during fMRI.
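In its simplest form, such a system records the listener’s click, convolves it with the impulse response of a simulated scene, and plays the result back over headphones. The sketch below shows a stripped-down mono version of that idea; the attenuation model and all parameter values are assumptions, and real systems add binaural rendering, head tracking, and low-latency processing.

```python
import numpy as np

FS = 44_100              # sample rate (Hz)
SPEED_OF_SOUND = 343.0   # m/s

def virtual_reflector_ir(distance_m: float, reflectivity: float = 0.5) -> np.ndarray:
    """Impulse response of a scene with one reflector: the direct sound at
    time zero plus a single delayed, attenuated echo."""
    delay = int(FS * 2.0 * distance_m / SPEED_OF_SOUND)  # out-and-back delay
    ir = np.zeros(delay + 1)
    ir[0] = 1.0                                # direct path
    ir[delay] = reflectivity / distance_m**2   # assumed 1/d^2 spreading loss
    return ir

# Stand-in for the listener's recorded mouth click (a damped sinusoid,
# as in the earlier sketch):
t = np.arange(int(FS * 0.005)) / FS
click = np.exp(-t / 0.001) * np.sin(2 * np.pi * 7_000.0 * t)

playback = np.convolve(click, virtual_reflector_ir(1.7))  # reflector 1.7 m away
```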

At present, we lack computational models, or even a process model, of how human echolocation works. With respect to other modalities, such as visual perception, the development of models has led to considerable advances in scientific understanding. An important challenge for future research will be to formulate quantitative models of human echolocation.

In the last 15 years, scientists from a variety of disciplines have gained considerable insights into sensory abilities and brain reorganization in blind and sighted people following the use of visual-to-auditory or visual-to-tactile sensory-substitution devices. Harvard Medical School professor Alvaro Pascual-Leone and colleagues have suggested that changes in brain activity and structure in response to blindness may reflect an inherent ability of “visual” areas to process spatial information from other sensory modalities; thus, in a way, echolocation could be considered similar to vision at the behavioral and neural levels. It will be exciting to integrate results gained from studies investigating echolocation with those from studies using, for example, visual-to-auditory substitution devices to map similarities and differences across neuroplastic changes in the human brain.

References

Ammons, C. H., Worchel, P., & Dallenbach, K. M. (1953). “Facial vision”: The perception of obstacles out of doors by blindfolded and blindfolded-deafened subjects. The American Journal of Psychology, 66, 519–553.

Arias, C., & Ramos, O. A. (1997). Psychoacoustic tests for the study of human echolocation ability. Applied Acoustics, 51, 399–419.

Arnott, S. R., Thaler, L., Milne, J. L., Kish, D., & Goodale, M. A. (2013). Shape-specific activation of occipital cortex in an early blind echolocation expert. Neuropsychologia, 51, 938–949.

Ashmead, D. H., Wall, R. S., Eaton, S. B., Ebinger, K. A., Snook-Hill, M. M., Guth, D. A., & Yang, X. (1998). Echolocation reconsidered: Using spatial variations in the ambient sound field to guide locomotion. Journal of Visual Impairment and Blindness, 92, 615–632.

Buckingham, G., Milne, J. L., Byrne, C. M., & Goodale, M. A. (2014). The size-weight illusion induced through human echolocation. Psychological Science, 26, 237–242.

Carlson-Smith, C., & Wiener, W. R. (1996). The auditory skills necessary for echolocation: A new explanation. Journal of Visual Impairment and Blindness, 90, 21–35.

De Volder, A. G., Catalan-Ahumada, M., Robert, A., Bol, A., Labar, D., Coppens, A., … Veraart, C. (1999). Changes in occipital cortex activity in early blind humans using a sensory substitution device. Brain Research, 826, 128–134.

DeLong, C. M., Au, W. W. L., & Stamper, S. A. (2007). Echo features used by human listeners to discriminate among objects that vary in material or wall thickness: Implications for echolocating dolphins. The Journal of the Acoustical Society of America, 121, 605–617.

Dufour, A., Després, O., & Candas, V. (2005). Enhanced sensitivity to echo cues in blind subjects. Experimental Brain Research, 165, 515–519.

Fiehler, K., Schütz, I., Meller, T., & Thaler, L. (2015). Neural correlates of human echolocation of path direction during walking. Multisensory Research, 28, 195–226.

Griffin, D. R. (1944). Echolocation by blind men, bats, and radar. Science, 100, 589–590.

Kellogg, W. N. (1962). Sonar system of the blind. Science, 137, 399–404.

Kohler, I. (1964). Orientation by aural cues. American Foundation for the Blind Research Bulletin, 4, 14–53.

Kolarik, A. J., Cirstea, S., & Pardhan, S. (2013). Evidence for enhanced discrimination of virtual auditory distance among blind listeners using level and direct-to-reverberant cues. Experimental Brain Research, 224, 623–633.

Maidenbaum, S., Abboud, S., & Amedi, A. (2014). Sensory substitution: Closing the gap between basic research and widespread practical visual rehabilitation. Neuroscience & Biobehavioral Reviews, 41, 3–15.

Milne, J. L., Goodale, M. A., Arnott, S. R., Kish, D., & Thaler, L. (2015). Parahippocampal cortex is involved in material processing through echolocation in blind echolocation experts. Vision Research, 109, 139–148. doi:10.1016/j.visres.2014.07.004

Milne, J. L., Goodale, M. A., & Thaler, L. (2014). The role of head movements in the discrimination of 2-D shape by blind echolocation experts. Attention, Perception, & Psychophysics, 76, 1828–1837.

Muchnik, C., Efrati, M., Nemeth, E., Malin, M., & Hildesheimer, M. (1991). Central auditory skills in blind and sighted subjects. Scandinavian Audiology, 20, 19–23.

Noppeney, U. (2007). The effects of visual deprivation on functional and structural organization of the human brain. Neuroscience & Biobehavioral Reviews, 31, 1169–1180.

Papadopoulos, T., Edwards, D. S., Rowan, D., & Allen, R. (2011). Identification of auditory cues utilized in human echolocation — objective measurement results. Biomedical Signal Processing and Control, 6, 280–290.

Pascual-Leone, A., & Hamilton, R. (2001). The metamodal organization of the brain. Progress in Brain Research, 134, 427–445.

Renier, L., De Volder, A. G., & Rauschecker, J. P. (2014). Cortical plasticity and preserved function in early blindness. Neuroscience & Biobehavioral Reviews, 41, 53–63.

Rojas, J. A. M., Hermosilla, J. A., Montero, R. S., & Espí, P. L. L. (2009). Physical analysis of several organic signals for human echolocation: Oral vacuum pulses. Acta Acustica united with Acustica, 95, 325–330.

Rowan, D., Papadopoulos, T., Edwards, D., & Allen, R. (2015). Use of binaural and monaural cues to identify the lateral position of a virtual object using echoes. Hearing Research, 323, 32–39.

Rowan, D., Papadopoulos, T., Edwards, D., Holmes, H., Hollingdale, A., Evans, L., & Allen, R. (2013). Identification of the lateral position of a virtual object based on echoes by humans. Hearing Research, 300, 56–65.

Schenkman, B. N., & Nilsson, M. E. (2011). Human echolocation: Pitch versus loudness information. Perception, 40, 840–852.

Schörnich, S., Nagy, A., & Wiegrebe, L. (2012). Discovering your inner bat: Echo-acoustic target ranging in humans. Journal of the Association for Research in Otolaryngology, 13, 673–682.

Supa, M., Cotzin, M., & Dallenbach, K. M. (1944). “Facial vision”: The perception of obstacles by the blind. The American Journal of Psychology, 57, 133–183.

Surlykke, A., Ghose, K., & Moss, C. F. (2009). Acoustic scanning of natural scenes by echolocation in the big brown bat, Eptesicus fuscus. Journal of Experimental Biology, 212, 1011–1020.

Teng, S., Puri, A., & Whitney, D. (2012). Ultrafine spatial acuity of blind expert human echolocators. Experimental Brain Research, 216, 483–488.

Teng, S., & Whitney, D. (2011). The acuity of echolocation: Spatial resolution in sighted persons compared to expert performance. Journal of Visual Impairment and Blindness, 105, 20–32.

Thaler, L., Arnott, S. R., & Goodale, M. A. (2011). Neural correlates of natural human echolocation in early and late blind echolocation experts. PLoS ONE, 6, e20162. doi:10.1371/journal.pone.0020162

Thaler, L., Milne, J. L., Arnott, S. R., Kish, D., & Goodale, M. A. (2014). Neural correlates of motion processing through echolocation, source hearing, and vision in blind echolocation experts and sighted echolocation novices. Journal of Neurophysiology, 111, 112–127.

Vercillo, T., Milne, J. L., Gori, M., & Goodale, M. A. (2015). Enhanced auditory spatial localization in blind echolocators. Neuropsychologia, 67, 35–40.

Wallach, H., Newman, E. B., & Rosenzweig, M. R. (1949). A precedence effect in sound localization. The Journal of the Acoustical Society of America, 21, 468.

Wallmeier, L., Geßele, N., & Wiegrebe, L. (2013). Echolocation versus echo suppression in humans. Proceedings of the Royal Society B: Biological Sciences, 280, 20131428.

Wallmeier, L., Kish, D., Wiegrebe, L., & Flanagin, V. L. (2015). Aural localization of silent objects by active human biosonar: Neural representations of virtual echo-acoustic space. European Journal of Neuroscience, 41, 533–545.

Wallmeier, L., & Wiegrebe, L. (2014a). Ranging in human sonar: Effects of additional early reflections and exploratory head movements. PLoS ONE, 9, e115363. doi:10.1371/journal.pone.0115363

Wallmeier, L., & Wiegrebe, L. (2014b). Self-motion facilitates echo-acoustic orientation in humans. Royal Society Open Science, 1, 140185.

Worchel, P., & Dallenbach, K. M. (1947). “Facial vision”: Perception of obstacles by the deaf-blind. The American Journal of Psychology, 60, 502–553.

Worchel, P., & Mauney, J. (1951). The effect of practice on the perception of obstacles by the blind. Journal of Experimental Psychology, 41, 170–176.
