Douglas L. Beck, AuD, spoke with Dr. Kopco about interaural timing differences, distance hearing, reverberation, and "knowing where to listen matters."
Academy: Good morning, Noro. It's a pleasure to speak with you.
Kopco: Hi, Doug. Great to meet you, too.
Academy: Before we get to the topic of the day (distance hearing), I'd like to ask where you earned your doctorate and what your dissertation was on?
Kopco: Sure. I earned my PhD in 2003 from Boston University in the Cognitive and Neural Systems Department. My dissertation was on spatial hearing, auditory ability, and pattern recognition in noisy environments.
Academy: Excellent. And I know your new paper was just published in the Proceedings of the National Academy of Sciences (PNAS; see references below) and addresses a relatively new topic: distance hearing. That is, most audiologists are familiar with the three main acoustic cues related to spatial hearing: (1) interaural level differences, (2) interaural time differences, and (3) spectral effects of the concha/pinna. However, we rarely go deeper, and you've explored a fascinating aspect of spatial hearing that takes us beyond the acoustic cues noted earlier.
Kopco: Yes, we examined listeners' ability to determine how far away a sound source is. That is, it's not just a matter of louder sounds coming from closer sources and quieter sounds coming from farther away, although that's generally true. Reverberation offers additional cues that allow the brain to discern how far away a sound is (i.e., its distance) independent of whether it is loud or soft.
So for the last 10 or 15 years, some researchers have wondered and written about how we know how far away a sound is. It turns out that reverberation works together with other acoustic cues, such as loudness and binaural cues, to provide distance information. Specifically, in an enclosed environment we perceive reverberation independently of loudness, and the reflected sounds, in addition to the direct sound, are analyzed by the brain almost instantaneously to generate a sense of distance.
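As a rough, hypothetical illustration of how reverberation can signal distance independently of loudness, one cue often discussed in the literature is the ratio of direct to reverberant energy in the sound reaching the ears: a nearby source produces relatively more direct energy, a distant one relatively more reverberant energy, and scaling the overall level leaves the ratio untouched. The sketch below (synthetic signals and window sizes are illustrative assumptions, not the study's stimuli) makes that point numerically:

```python
import numpy as np

def drr_db(impulse_response, direct_ms=2.5, fs=16000):
    """Direct-to-reverberant ratio (dB): energy in the first few
    milliseconds (the direct path) vs. the remaining tail (reverberation)."""
    split = int(fs * direct_ms / 1000)
    direct = np.sum(impulse_response[:split] ** 2)
    reverb = np.sum(impulse_response[split:] ** 2)
    return 10 * np.log10(direct / reverb)

fs = 16000
t = np.arange(fs) / fs  # 1-second synthetic room response
tail = 0.05 * np.exp(-6 * t) * np.random.default_rng(0).standard_normal(fs)

near = tail.copy()
near[0] = 1.0   # strong direct sound: source nearby
far = tail.copy()
far[0] = 0.2    # weak direct sound relative to the room: source farther away

print(drr_db(near, fs=fs) > drr_db(far, fs=fs))             # higher DRR when near
print(abs(drr_db(2.0 * far, fs=fs) - drr_db(far, fs=fs)))   # loudness change: DRR unaffected
```

Doubling the overall amplitude (a pure loudness change) scales direct and reverberant energy equally, so the ratio, and hence this distance cue, is unchanged.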
Academy: These studies must be rather difficult to set up because you have to actually identify and separate out loudness from distance. How did you do that in the lab?
Kopco: We approached it by mixing "soft-but-nearby" and "loud-but-distant" sounds, which forces the listener to ignore loudness while judging distance of sounds. We used behavioral listening tasks in addition to functional magnetic resonance imaging (fMRI) and mathematical computations to gather and interpret the results.
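The logic of mixing "soft-but-nearby" and "loud-but-distant" trials can be sketched as a toy simulation (all numbers here, the 6 dB-per-doubling falloff, the 60–80 dB rove range, and the trial counts, are illustrative assumptions, not the parameters of the published study). Without level roving, intensity alone predicts distance; with roving, the statistical link is broken and a listener must rely on other cues, such as reverberation:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
distance = rng.choice([0.5, 1.0, 2.0, 4.0], size=n)  # source distance (m)

# Without roving, level falls ~6 dB per doubling of distance,
# so presentation level alone is a strong predictor of distance.
natural_level = 70 - 6 * np.log2(distance)

# With level roving, presentation level is drawn independently of distance
# ("soft-but-nearby" and "loud-but-distant" trials both occur), so loudness
# carries essentially no distance information.
roved_level = rng.uniform(60, 80, size=n)

print(np.corrcoef(natural_level, distance)[0, 1])  # strongly negative
print(np.corrcoef(roved_level, distance)[0, 1])    # near zero
```

The same decorrelation idea underlies the behavioral and fMRI analyses: any distance sensitivity that survives level roving cannot be explained by intensity alone.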
Academy: I believe I read that you and your co-authors determined auditory processing of distance likely occurs in the superior temporal plane?
Kopco: Yes, this is what our results indicate. And this makes good sense as many other aspects of audition and spatial cues also appear to be processed in the same area. Specifically, the distance-sensitive area found by fMRI in our study is in the posterior non-primary auditory cortices.
Academy: And, as you just indicated, bilateral auditory input is required to interpret reverberation cues beyond loudness alone, and that makes sense with regard to classic localization and spatial hearing.
Kopco: Yes, I would agree, although it is not yet clear exactly how the brain extracts the reverberation information from the sounds it receives.
Academy: This is fascinating work. And if I may speculate a little, your work reinforces my own (often repeated) belief that "knowing where to listen matters." That is, I believe the primary reason people do not do well with traditional hearing aids in noisy environments is that standard compression circuits (such as wide dynamic range compression, WDRC) generally squash the interaural level differences (ILDs). Because the brain does not know where to attend (due to the missing ILDs), it cannot determine the location of the sound, which leads to the most common complaint of all hearing aid wearers: the inability to listen well in noise.
Kopco: I agree. As you indicated, that is a little speculative, but it makes good sense. I think that to do well in noise, hearing aids and other amplification systems need to maintain the natural acoustic cues (such as ILDs and ITDs) that normal-hearing people perceive.
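The ILD-squashing point above can be made concrete with a toy static compression curve (the 45 dB threshold and 3:1 ratio below are hypothetical example values, not any particular product's settings). When each ear's signal runs through its own unlinked compressor, the louder near-ear signal is compressed more than the quieter far-ear signal, shrinking the level difference the brain would use for localization:

```python
def compress(input_db, threshold=45.0, ratio=3.0):
    """Toy static WDRC curve: above threshold, output level grows
    only 1/ratio dB per dB of input."""
    if input_db <= threshold:
        return input_db
    return threshold + (input_db - threshold) / ratio

# A source off to one side arrives louder at the near ear than the far ear.
near_ear, far_ear = 70.0, 60.0
ild_in = near_ear - far_ear                       # 10 dB ILD at the ears

# Two unlinked compressors, one per ear, each compress their own level.
ild_out = compress(near_ear) - compress(far_ear)

print(round(ild_in, 2))   # 10.0
print(round(ild_out, 2))  # 3.33: the 10 dB ILD is squashed to ~10/ratio dB
```

One design response, used in some bilaterally linked hearing aids, is to apply the same gain at both ears so that the interaural difference passes through unchanged; the sketch above shows only why unlinked compression distorts the cue.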
Academy: Noro, congratulations on your work and your PNAS publication.
Kopco: Thanks, Doug. I appreciate your interest in my work.
Norbert Kopco, PhD, is with the Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Harvard Medical School in Boston.
Douglas L. Beck, AuD, Board Certified in Audiology, is the Web content editor for the American Academy of Audiology.
For More Information, References, and Recommendations
Arbogast TL, Mason CR, Kidd G. (2005) The effect of spatial separation on informational masking of speech in normal-hearing and hearing-impaired listeners. Journal of the Acoustical Society of America 117:2169-2180.
Beck DL, Sockalingam R. (2010) Facilitating spatial hearing through advanced hearing aid technology. Hearing Review April.
Beck DL, Flexer C. (2011) Listening is where hearing meets brain. Hearing Review February.
Culling JF, Akeroyd MA. (2010) Spatial hearing. In: Plack CJ, ed. The Oxford Handbook of Auditory Science: Hearing (Vol. 3; series editor David R. Moore). Oxford University Press.
Kopčo N, Huang S, Belliveau JW, Raij T, Tengshe C, Ahveninen J. (2012) Neuronal representations of distance in human auditory cortex. Proceedings of the National Academy of Sciences of the USA 109(27):11019-11024.
Kopčo N, Shinn-Cunningham BG. (2011) Effect of stimulus spectrum on distance perception for nearby sources. Journal of the Acoustical Society of America 130(3):1530-1541.