Have you seen the movie “Finding Dory”? In one of the many exciting sequences, the protagonist Dory needs the help of Bailey the beluga whale to navigate through a maze of pipes. In a later, even more exciting sequence, Bailey helps Dory keep track of a moving truck in which Dory’s friends Nemo and Marlin have inadvertently been trapped.
Bailey the beluga performs these miracles by doing what he does every day: seeing with sound. In fact, Bailey and other toothed whales and dolphins have evolved echolocation skills more powerful than any man-made sonar. No wonder the movie’s makers chose to name Bailey’s exhibit “The World’s Most Powerful Pair of Glasses.” Whales and other marine mammals returned to the water, and their hearing had to re-adapt to an underwater environment. Echolocation is one of many interesting features of the auditory system of these species. For example, many of these species can hear up to or even above 100 kHz. In comparison, human hearing extends to 20 kHz at best, and most of the speech sounds we use to communicate with each other fall below 6 kHz. But today we are extolling the virtues of echolocation.
Bailey and other marine mammals are not the only animals that use echolocation. Bats and even some cave-dwelling birds use sound to see. Bailey and his other toothed whale and dolphin friends, however, have taken echolocation to great levels of sophistication. The basic principle is the same as in man-made sonar—produce a loud signal and listen for it to bounce back from an obstacle. Basic distance and size estimates of the reflector can then be derived from the characteristics of the reflection.
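For readers who like to see the arithmetic, here is a minimal sketch in Python of the ranging principle just described. The speed-of-sound value and the example delay are illustrative assumptions, not figures from the article.

```python
# A minimal sketch of the sonar arithmetic described above (illustrative only;
# not taken from the article). An echo returning after a delay t has traveled
# to the reflector and back, so the range is sound_speed * t / 2.

SOUND_SPEED_SEAWATER = 1500.0  # meters per second; a typical nominal value


def range_from_echo_delay(delay_s: float) -> float:
    """Estimate the distance (meters) to a reflector from the round-trip echo delay (seconds)."""
    return SOUND_SPEED_SEAWATER * delay_s / 2.0


# An echo arriving 40 milliseconds after the click implies a reflector about 30 meters away.
print(range_from_echo_delay(0.040))  # -> 30.0
```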
Mere distance and size, though, are child’s play for the expert whales. Dolphins and whales not only discern the distance and size of a reflecting object, they can also differentiate between reflectors of different shapes. And we are talking about subtle differences here. Whales, working under the positive reinforcement of a fish for a correct response, can learn to differentiate between cylindrical and square metal rods just a few inches wide. They have also demonstrated the ability to differentiate between reflectors of different textures and to estimate the speed of moving objects. In other words, these animals probably have the world’s most powerful pair of glasses.
When a whale produces a click for echolocation, it has no idea how far the click will have to travel before being reflected. The wise thing to do, under these circumstances, would be to produce the loudest possible click, and that is exactly what the whale does. Would that not put the whale’s own hearing in danger?
In the answer to this question we find the second marvel of biology as it relates to echolocation. Whales and dolphins do indeed protect their own hearing from harmful exposure when producing echolocation clicks. These animals have evolved the ability to change the gain in their auditory system very rapidly once the echolocating click is produced. This leads to two questions: first, how do we know that this really happens? And second, how does the animal do it?
We know that there is indeed active gain control in the auditory system of whales and dolphins from many measurements of the electrical activity in the brain (the auditory brainstem response) recorded during and after the production of echolocating clicks. The brainstem response is measured using a suction-cup electrode placed on the animal’s head while the animal performs various experimental maneuvers, usually for a fish reward. These experiments are typically conducted in facilities that house rescued marine mammals or at large aquariums around the world. So the data set is not large, but the evidence is irrefutable. The size (amplitude) of the brainstem response is smallest while the animal is producing the clicks and increases with time afterward. This observation suggests that the animal has a way of inhibiting the response during click production and subsequently releases that inhibition to perceive the reflected echo. If that is not amazing enough, these animals seem to know physics well enough to halve the inhibition (double the gain) with every doubling of the distance to the target. It is important to note, though, that there is no evidence to suggest that these animals can exercise the same protective maneuver when the sound is not self-produced. We will return to the importance of this distinction between self-generated and external sounds later.
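To make the “double the gain with every doubling of distance” observation concrete, here is a small illustrative sketch. It assumes that doubling the linear (amplitude) gain corresponds to roughly a 6 dB step; the reference distance and starting gain are arbitrary values chosen for illustration, not measurements from these studies.

```python
import math

# Illustrative sketch of the gain rule described above: auditory gain roughly
# doubles (about +6 dB in amplitude terms) for every doubling of the distance
# to the target. Reference values below are arbitrary, for illustration only.

REFERENCE_DISTANCE_M = 10.0  # hypothetical reference distance
REFERENCE_GAIN_DB = 0.0      # hypothetical gain at the reference distance


def gain_db_at_distance(distance_m: float) -> float:
    """Gain (dB) that doubles in linear terms with each doubling of distance."""
    return REFERENCE_GAIN_DB + 6.0 * math.log2(distance_m / REFERENCE_DISTANCE_M)


for d in (10, 20, 40, 80):
    print(f"target at {d:>3} m -> gain {gain_db_at_distance(d):+.1f} dB")
# target at  10 m -> gain +0.0 dB
# target at  20 m -> gain +6.0 dB
# target at  40 m -> gain +12.0 dB
# target at  80 m -> gain +18.0 dB
```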
So, what biological mechanism is responsible? The short answer is that we do not know exactly, yet. Some propose that this is a simple case of a loud sound influencing the response to subsequent sounds even after the initial loud sound is no longer in the environment (technically called forward masking). One view is that forward masking may be the psychophysical outcome of an underlying physiological process. The physiological underpinning of this phenomenon could well be a top-down control of the ear by the brain, made possible through a network of nerves collectively called the auditory efferent network. The auditory efferent system provides descending connections from the brain to the cells in the inner ear responsible for receiving and amplifying external sounds (the outer hair cells). It is plausible, even likely, that the brain can change the gain provided by these outer hair cells depending on the level (volume) of the received sound. Bats, in contrast, control the intensity of self-generated echolocation signals reaching their ears by momentarily stiffening the middle ear and rejecting much of the incoming sound. The middle-ear muscles relax quickly after sound generation is complete, allowing normal transmission of external sounds into the ear. One way or the other, here is another example of a specialized auditory system that evolved to meet the specific needs of an animal operating in a specific environment. Can other animals learn to echolocate?
The answer to that question is definitely in the affirmative. Bats, as mentioned above, are famous for their use of echolocation to hunt in the darkness of night. Even some cave-dwelling birds have rudimentary echolocating abilities. But what if I told you that humans can see using sound as well?
Daniel Kish is not the only human able to echolocate, but he certainly is one of the foremost experts. Not only does he ride a bicycle in regular traffic using echolocation, he even hikes on trails by himself using the reflections of the clicks he produces to guide him around obstacles and keep himself on the trail.
Kish and others who can use echolocation to replace the function of vision can be thought of as sophisticated users of sound who learn to discern sizes, shapes, distance, terrain, and texture through sound. Some of them have developed these skills and understand the acoustics well enough to teach others to use sound to navigate their environment. So, how exactly do they do this?
It is highly likely that most blind people use passive echolocation to gather information about their environment. Daniel Kish, and others like him, however, actively produce clicks to sample their surroundings. These individuals have developed astounding skills to see through sound and lead active and adventurous lives, using their hearing to compensate for the vision they do not have.
Scientists have started exploring the neural substrates of successful echolocation in these experts. In experiments using functional magnetic resonance imaging, they are discovering that the primary visual cortex is active during echolocation in blind experts. This suggests that neuroplastic changes allow the visual areas of the brain to be redeployed to respond to acoustic stimuli: literally, seeing through sound.
Humans and the Echolocating Underworld
While the general amazement about the echolocating abilities of these marine mammals is universal, the influence of human activity on the echolocating animals and their environment is much debated and less well accepted. At the heart of this controversy is the question whether noise created by commercial and naval activity in the oceans is precipitating a loss of hearing in marine mammals. This is of particular importance as hearing is the primary sense used by these animals to communicate, navigate, hunt, and protect themselves. A toothed whale or dolphin losing its hearing would be akin to a cheetah losing its eyes and legs.
The debate about human activity and whale or dolphin hearing reaches a crescendo every time there is a mass beach stranding of these animals. Those who believe that human-generated marine noise is harming the hearing of marine mammals argue that these strandings did not start until the 1960s, when more powerful naval sonar was developed and deployed. These strandings often coincide with naval exercises, and they differ from typical strandings. In a typical stranding, for example, a large group of pilot whales may be found beached together. The allegedly sonar-related strandings are different: beaked whales are found stranded individually or in small groups along a stretch of coast several kilometers long.
While the coincidence with naval exercises and the atypical stranding patterns suggest a connection with human activity, opponents legitimately cite the uncertainties in drawing a causal relationship. Because the strandings happen far away from the naval exercises and with some time lag, there is no definitive way of knowing exactly how close the stranded animals were to the exercises or how long they were exposed.
Partly to understand the effect of human-generated noise on these marine mammals, and partly simply to understand their behavior better, scientists have started conducting more controlled observations and even prospective experiments. A group of Norwegian scientists recently and deliberately exposed pilot whales and sperm whales to naval sonar while recording their foraging dives. Their results suggest that exposure to naval sonar significantly alters the diving behavior of these animals: the whales either do not dive as deep, do not dive as often, or do not dive at all for prolonged periods after the sonar exposure. These changes in diving behavior have direct implications for how much food these whales can take in.
Another piece of evidence linking human activity with marine mammal hearing and behavior comes from the observation that dolphins and whales simplify their calls in noisy portions of our oceans. Scientists from the University of Maryland analyzed recordings of dolphin calls made with underwater microphones off the coast of Maryland in the Atlantic Ocean. They observed that dolphins essentially simplified their calls, making them shorter and shifting their pitch higher, in the noisiest portions of the ocean. An independent group of scientists in Japan recently made a similar observation. In the Pacific Ocean, a thousand kilometers south of Tokyo, they observed humpback whales shortening their songs whenever noisy ships went by. Clearly, these animals are adapting to life in a noisy water-world.
Recent allowances for seismic exploration for oil and gas off the east coast of the United States will bring the debate over human-generated noise and marine mammals back to the forefront. Several news agencies, including The Guardian, are reporting that the U.S. government gave the go-ahead to a handful of private companies to explore the ocean floor for reservoirs of oil and gas. This exploration is to be done using seismic guns that generate pulses of very loud underwater sound. The sound travels through the water and penetrates the ocean floor, and detectors listen for reflections. Reflections from gas and oil pockets have a distinct signature, allowing the explorers to create a map of possible drill sites. Conservationists suggest that these seismic guns create some of the loudest sounds ever produced by humans. They fear that these sonic blasts will destroy the bottom layer of the ocean food chain and irreparably harm the hearing organs of the large marine mammals and other animals that depend on hearing to survive. Proponents of the exploratory activities cite the thorough review of the proposal by the National Oceanic and Atmospheric Administration and its conclusion that the proposed seismic exploration is not expected to cause damage of any significance.
Yet again, we are faced with a choice between prosperity and conservation. The drive for energy independence in the USA is being pitted directly against the health and well-being of the marine environments around us. While the rest of us debate, the whales and dolphins are learning Morse code, so they can shorten and simplify their communication even more when the real head-banging rock and roll comes to town.
Sumitrajit Dhar, PhD, is an associate editor for Audiology Today and www.audiology.org. He serves as the chair of the Roxelyn and Richard Pepper Department of Communication Sciences and Disorders at Northwestern University in Evanston, Illinois, USA. He holds the Hugh Knowles Professorship in Hearing Science.