Use of Artificial Intelligence in the Making of Hearing Aids

Applications of artificial intelligence are growing every day across different sectors, and healthcare offers numerous examples. AI in hearing aids has actually been developing for years, and the following is a look at how it happened. Hearing aids used to be relatively simple, but when manufacturers introduced a technology known as wide dynamic range compression (WDRC), the devices began to make a few decisions based on what is heard. For hearing aids to work effectively, they need to adapt to a person's individual hearing needs as well as to all sorts of background noise environments. AI, machine learning, and neural networks are well suited to such a complicated, nonlinear, multi-variable problem.
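
To make WDRC concrete, here is a minimal Python sketch of the core idea: quiet sounds receive full gain, while gain shrinks as the input level rises past a knee point, so loud sounds are amplified less. The knee, compression ratio, and gain values below are illustrative stand-ins, not clinical fitting parameters, and real devices apply this per frequency channel.

```python
def wdrc_gain_db(input_level_db, knee_db=45.0, ratio=2.0, max_gain_db=30.0):
    """Compute a simple wide dynamic range compression (WDRC) gain.

    Below the knee point, quiet sounds get the full gain; above it,
    gain is reduced so loud sounds are amplified less (compression).
    Parameter values are illustrative, not clinical fittings.
    """
    if input_level_db <= knee_db:
        return max_gain_db
    # Above the knee, output grows 1/ratio dB per input dB,
    # so gain shrinks as the input gets louder.
    return max_gain_db - (input_level_db - knee_db) * (1.0 - 1.0 / ratio)

for level in (30, 45, 60, 75, 90):
    print(f"input {level} dB SPL -> gain {wdrc_gain_db(level):.1f} dB")
```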

Researchers have already accomplished a lot with AI when it comes to improving hearing. For instance, researchers at the Perception and Neurodynamics Laboratory (PNL) at The Ohio State University trained a deep neural network (DNN) to distinguish speech from other noise (such as humming and other background conversations). DeLiang Wang, professor of computer science and engineering at Ohio State University, explained in IEEE Spectrum: "People with hearing impairment could decipher only 29% of words muddled by babble without the program, but they understood 84% after the processing."
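
Work of this kind generally belongs to a family of DNN-based speech enhancement methods that estimate a time-frequency mask from noisy audio. The article does not describe the PNL model's architecture, so the following PyTorch sketch is only an illustrative mask-estimation network; the layer sizes, the N_FREQ constant, and the toy input data are all assumptions.

```python
import torch
import torch.nn as nn

# Sketch of the mask-estimation idea behind DNN speech enhancement:
# map each noisy spectrogram frame to a time-frequency mask that
# suppresses noise-dominated bins and keeps speech-dominated ones.
N_FREQ = 257  # frequency bins per frame (e.g., from a 512-point FFT)

mask_net = nn.Sequential(
    nn.Linear(N_FREQ, 1024),
    nn.ReLU(),
    nn.Linear(1024, 1024),
    nn.ReLU(),
    nn.Linear(1024, N_FREQ),
    nn.Sigmoid(),  # mask values in [0, 1]: 1 keeps a bin, 0 suppresses it
)

# Toy forward pass: one batch of noisy magnitude-spectrogram frames.
noisy_frames = torch.rand(8, N_FREQ)   # stand-in for real STFT magnitudes
mask = mask_net(noisy_frames)          # estimated ratio mask
enhanced = mask * noisy_frames         # keep speech-dominated energy

# Training (not shown) would minimize the error between `mask` and an
# ideal mask computed from paired clean/noisy recordings.
print(enhanced.shape)  # torch.Size([8, 257])
```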

In recent years, major hearing aid manufacturers have been adding AI technology to their premium hearing aid models. For example, Widex's Moment hearing aid utilizes AI and machine learning to create hearing programs based on a wearer's typical environments. Oticon recently introduced its newest device, Oticon More, the first hearing aid with an onboard deep neural network. Oticon More was trained on 12 million-plus real-life sounds so that people wearing it can better understand speech and the sounds around them. In a crowded place, Oticon More's neural net receives a complicated layering of sounds, known as the input. The DNN gets to work, first scanning and extracting simple sound elements and patterns from the input. It then builds these elements into a picture of the sound scene, recognizing and making sense of what is happening. Lastly, the hearing aid decides how to balance the sound scene, making sure the output is clean and ideally balanced for the person's unique type of hearing loss. Speech and other sounds in the environment are complicated acoustic waveforms, but with unique patterns and structures that are exactly the sort of data deep learning is designed to analyze.
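
Oticon's on-chip network is proprietary, so the following Python sketch is purely conceptual: it mimics the three stages the article describes (extract simple elements, build a picture of the scene, balance the output) with made-up band splits, labels, and gain values standing in for what a real DNN would learn.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_elements(frame):
    """Stage 1: scan the raw input for simple sound elements; here,
    just the energy in four coarse frequency bands."""
    bands = np.array_split(frame, 4)
    return np.array([np.sum(b ** 2) for b in bands])

def build_sound_objects(elements):
    """Stage 2: combine simple elements into a picture of the scene;
    here, crudely label the dominant band 'speech-like'."""
    dominant = int(np.argmax(elements))
    background = float(np.sum(elements) - elements[dominant])
    return {"speech_band": dominant, "background_level": background}

def balance_scene(frame, scene, hearing_profile_gain_db):
    """Stage 3: balance the output for the wearer's hearing loss:
    boost the speech-dominant band, attenuate the rest."""
    bands = np.array_split(frame, 4)
    out = []
    for i, band in enumerate(bands):
        gain_db = hearing_profile_gain_db if i == scene["speech_band"] else -6.0
        out.append(band * 10 ** (gain_db / 20))
    return np.concatenate(out)

frame = rng.standard_normal(256)  # stand-in for one audio frame
scene = build_sound_objects(extract_elements(frame))
output = balance_scene(frame, scene, hearing_profile_gain_db=12.0)
print(scene, output.shape)
```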

Hearing aids range widely in price, and some at the lower end have fewer AI-driven bells and whistles. Some patients may not need all the features: people who live alone, rarely leave the house, or seldom find themselves in crowded scenarios, for instance, might not benefit from the functionality found in higher-end models.

But for anyone who is out and about a lot, especially in situations with big soundscapes, AI-powered features allow for an improved hearing experience. One improvement that can actually be measured is memory recall. It's not that hearing aids like Oticon More literally improve a person's memory; rather, artificial intelligence helps people spend less time trying to make sense of the noise around them, reducing what is known as listening effort. When listening requires less effort, a person can focus more on the conversation and all the nuances conveyed within it. So, the use of AI in hearing aids helps the brain work in a more natural way.
