Virtual Seminar/Symposium



Abstract:

Imagine having a sound mixing board for real life: You could turn up the voice of a conversation partner, silence the drone of an airplane engine, or remove echoes from the public address system in a train station. This seminar will describe systems and algorithms for augmented listening technologies, like hearing aids and augmented reality systems, that can enhance our perception of sound and even provide superhuman hearing. Sensory augmentation lies at the intersection of artificial intelligence and human intelligence, and it addresses two key problems. First, we must design sensing and inference technologies that surpass normal human senses, for example by using large microphone arrays or acoustic sensor networks. Second, we must make that enhanced information useful to human listeners, carefully preserving the spatial, temporal, and spectral cues that the human auditory system uses to interpret sound. By combining large-scale sensing systems with human-centric AI, we can build listening systems that perform well even in noisy, dynamic environments where current devices fail.
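
As a concrete illustration of the first problem, the short Python sketch below implements a textbook delay-and-sum beamformer for a small linear microphone array: each microphone signal is time-aligned toward a chosen direction so that the target adds coherently while uncorrelated noise averages down. This is only a minimal sketch for intuition, not the speaker's method; the four-microphone geometry, 16 kHz sample rate, and simulated 440 Hz talker are hypothetical choices, and practical augmented listening systems rely on adaptive, binaural-cue-preserving filters well beyond this example.

import numpy as np


def delay_and_sum(signals, mic_positions, steering_angle_deg, fs, c=343.0):
    """Steer a linear array toward steering_angle_deg (0 deg = broadside).

    signals:       (num_mics, num_samples) time-domain recordings
    mic_positions: (num_mics,) microphone coordinates along the array axis, meters
    fs:            sample rate in Hz
    c:             speed of sound in m/s
    """
    num_mics, num_samples = signals.shape
    theta = np.deg2rad(steering_angle_deg)
    # Relative propagation delay to each microphone for a plane wave from theta.
    delays = mic_positions * np.sin(theta) / c          # seconds
    freqs = np.fft.rfftfreq(num_samples, d=1.0 / fs)
    spectra = np.fft.rfft(signals, axis=1)
    # Advance each microphone by its delay (fractional-sample shift applied in
    # the frequency domain) so the target direction adds coherently.
    spectra *= np.exp(2j * np.pi * freqs[None, :] * delays[:, None])
    return np.fft.irfft(spectra.mean(axis=0), n=num_samples)


if __name__ == "__main__":
    fs = 16000
    t = np.arange(fs) / fs
    mics = np.arange(4) * 0.04          # four mics, 4 cm spacing (hypothetical)
    angle = 30.0                        # direction of the simulated talker
    delays = mics * np.sin(np.deg2rad(angle)) / 343.0
    clean = np.stack([np.sin(2 * np.pi * 440 * (t - d)) for d in delays])
    noisy = clean + 0.5 * np.random.randn(*clean.shape)
    enhanced = delay_and_sum(noisy, mics, steering_angle_deg=angle, fs=fs)
    ref = np.sin(2 * np.pi * 440 * t)   # undelayed target at the reference mic
    print("single-mic error RMS:", np.std(noisy[0] - ref))
    print("beamformed error RMS:", np.std(enhanced - ref))

Running the script should show the beamformed error falling to roughly half of the single-microphone error, since averaging four independent noise signals reduces the noise level while the steered target is preserved.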


Biosketch:

Ryan M. Corey is a postdoctoral research fellow at the University of Illinois Urbana-Champaign and the Discovery Partners Institute. A hearing aid user since he was a teenager, Dr. Corey studies systems and algorithms to improve the performance of listening technology in noisy environments. He received the B.S.E. degree from Princeton University and the M.S. and Ph.D. degrees from the University of Illinois Urbana-Champaign, all in electrical engineering. Since 2017, he has supervised a team of engineering and design students in the Illinois Augmented Listening Laboratory. Dr. Corey has received the National Science Foundation Graduate Research Fellowship, the Microsoft AI for Accessibility grant, the Intelligence Community Postdoctoral Research Fellowship, and best paper awards for his work on wearable microphone arrays and dynamic range compression algorithms.