Researchers Reveal Breakthrough in Technology That Helps Hearing-Impaired Individuals Hear Through Their Fingers
Significant Improvements in Auditory Spatial Perception Through Innovative Tactile Technology
A team of researchers from the Ivcher Institute for Brain, Cognition, and Technology (BCT Institute) at Reichman University (Herzliya, Israel) has identified a significant deficit in auditory spatial perception among hearing aid users and cochlear implant recipients and introduced an innovative multisensory solution that leads to notable improvements in this ability.
The research, recently published in the prestigious journal iScience, highlights the challenges faced by individuals born with hearing impairment, even after years of hearing aid use or cochlear implantation.
“We intentionally chose people born with congenital auditory deprivation because these individuals offer a unique opportunity to explore how sensory representations are formed,” explains Dr. Adi Snir, first author of the paper and postdoctoral fellow at the BCT Institute. “What we found were severely impaired auditory spatial capabilities, with significant difficulty in localizing sounds, especially when tracking moving sound sources, even among individuals with bilateral cochlear implants,” adds Snir.
The auditory system constantly compares the arrival times and levels of sounds, quickly calculating differences to locate the positions of sound sources. “With hearing impairment, there is a change in the frequency ranges and resolution of perceived hearing, often compounded by additional distortions caused by hearing devices, which interfere with this process, particularly for moving sounds,” explains Dr. Katarzyna Cieśla-Seifer, co-author of the paper and postdoctoral fellow at the BCT Institute and the Institute of Physiology and Pathology of Hearing in Warsaw.
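To make the binaural cues mentioned above concrete, the sketch below estimates the two quantities the auditory system compares: the interaural time difference (ITD) and interaural level difference (ILD) for a source at a given angle. This is a minimal illustration using a standard spherical-head approximation, not code from the study; the head radius and the simple ILD falloff are assumed values for demonstration only.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s at roughly room temperature
HEAD_RADIUS = 0.0875     # m; typical adult head radius (assumed)

def interaural_time_difference(azimuth_deg):
    """Woodworth spherical-head approximation of ITD in seconds.
    azimuth_deg: source angle, 0 = straight ahead, +90 = fully to the right."""
    theta = np.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + np.sin(theta))

def interaural_level_difference(azimuth_deg, max_ild_db=20.0):
    """Very rough ILD model: the level difference grows as the source moves
    to the side, saturating at max_ild_db (illustrative only)."""
    return max_ild_db * np.sin(np.radians(azimuth_deg))

# A listener localizes a source by comparing these tiny differences:
for az in (0, 30, 60, 90):
    itd_us = interaural_time_difference(az) * 1e6
    ild_db = interaural_level_difference(az)
    print(f"azimuth {az:3d} deg: ITD ~ {itd_us:6.0f} us, ILD ~ {ild_db:4.1f} dB")
```

Even at 90 degrees the ITD is only about 0.6-0.7 milliseconds, which is why the frequency and resolution losses described above, plus device processing, can disrupt localization so severely.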
Inspired by the function of the auditory system, the researchers developed a technology, the Touch Motion Algorithm (TMA), which delivers tactile feedback through the fingertips, using intensity adjustments to represent external spatial positions and movement. “We wanted to test whether we could represent spatial information in a way that reflects the auditory system’s processes, but using an alternative sensory modality—in this case, touch,” Snir explains.
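The published paper does not spell out the TMA's internals in this release, so the following is a hypothetical sketch of one way intensity adjustments across fingertip actuators could encode a source's horizontal position and motion; the actuator count, layout, and falloff function are assumptions, not the authors' implementation.

```python
import numpy as np

def fingertip_intensities(azimuth_deg, n_actuators=4):
    """Hypothetical intensity-panning scheme: map a source azimuth
    (-90..+90 degrees, left to right) onto n fingertip actuators arranged
    left-to-right, so actuators nearest the source vibrate most strongly.
    Illustrative only; not the published TMA."""
    # Spread actuator centres evenly across the frontal azimuth range.
    centres = np.linspace(-90.0, 90.0, n_actuators)
    # Gaussian-like falloff: closer actuators receive more energy.
    width = 180.0 / n_actuators
    weights = np.exp(-0.5 * ((azimuth_deg - centres) / width) ** 2)
    return weights / weights.max()   # normalise so the strongest actuator = 1.0

# A source sweeping from left to right produces a travelling peak of
# intensity across the fingertips, conveying motion through touch.
for az in (-90, -45, 0, 45, 90):
    print(az, np.round(fingertip_intensities(az), 2))
```

The design intuition matches the quote above: just as the ears compare levels to infer direction, the fingertips compare vibration intensities, so a moving sound becomes a moving pattern on the skin.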
The participants quickly developed the ability to sense spatial cues through tactile feedback and achieved near-normal accuracy in localization tasks. The team also explored the effect of combining auditory and tactile inputs, with participants reporting that the combination significantly simplified the localization task.
“The feedback we received from participants emphasizes the value of integrating tactile cues to improve the experience for people with hearing impairments,” notes Dr. Cieśla-Seifer. “It also underscores the importance of multisensory integration in auditory rehabilitation.”
“These findings are particularly exciting because they have implications for understanding how spatial representations develop in the brain,” says Prof. Amir Amedi, senior author of the study. “Hearing is the only modality capable of naturally representing our entire 3D environment, so lacking access to this information from birth might cause some deficits in spatial abilities for this population.”
However, the researchers found that participants were able to rapidly perform spatial tasks using tactile cues, underscoring the remarkable learning capacity of the adult human brain and supporting the idea that spatial representation is not necessarily tied to any specific sensory modality.
The study’s findings carry profound implications for the future of sensory rehabilitation. “By harnessing tactile feedback, the potential to enhance existing assistive technologies could significantly improve the quality of life for millions of people worldwide,” explains Prof. Amir Amedi, Founding Director of the BCT Institute.
Journal: iScience
Method of Research: Experimental study
Subject of Research: People
Article Title: Highly compromised auditory spatial perception in congenitally hearing-impaired individuals and rapid improvement with spatial tactile technology
Article Publication Date: 20-Sep-2024