Deep Neural Networks in hearing aids!
What’s all the fuss about deep neural networks?
If you’ve been worrying about the potential for artificial intelligence – or AI – to take over the world, there’s one area where it can only be a force for good. That’s in hearing aids.
The latest high-end models use a deep neural network, or DNN, to deliver much clearer speech in noisy situations – typically one of the hardest jobs for a hearing aid.
The DNN uses highly sophisticated, multi-layered modelling to process data in a way that’s inspired by the human brain. Once the system has been trained on examples of the sounds it needs to recognise, it can make predictions and solve problems when it meets new sounds it has never heard before.
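To make that idea a little more concrete, here is a deliberately tiny sketch in Python (NumPy) of the same train-then-predict principle. It is purely illustrative and is not the code inside any hearing aid: the eight-band "sound frames", the network size and the training loop are all invented for the example. A small network is shown labelled frames of "speech-like" and "noise-like" sound during training, and afterwards it classifies a frame it has never seen.

```python
# Illustrative sketch only -- not the DNN used by Oticon, Phonak or any other manufacturer.
# A tiny two-layer network learns to tell "speech-like" frames (energy concentrated
# in the low frequency bands) from "noise-like" frames (energy spread evenly),
# then makes a prediction on a brand-new frame.
import numpy as np

rng = np.random.default_rng(0)

def make_frame(speech_like: bool) -> np.ndarray:
    """Return a toy 8-band energy spectrum for one short slice of sound."""
    if speech_like:
        bands = rng.normal(loc=[5, 4, 3, 2, 1, 0.5, 0.3, 0.2], scale=0.3)
    else:
        bands = rng.normal(loc=2.0, scale=0.3, size=8)
    return np.clip(bands, 0, None)

# Training data: 200 labelled frames (1 = speech-like, 0 = noise-like)
X = np.array([make_frame(i % 2 == 0) for i in range(200)])
y = np.array([(i % 2 == 0) * 1.0 for i in range(200)])

# One hidden layer of 16 units -- the "multi-layered modelling" in miniature
W1 = rng.normal(scale=0.1, size=(8, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.1
for epoch in range(1000):
    # Forward pass: predict a probability of "speech" for every training frame
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2).ravel()
    # Backward pass: nudge the weights to reduce the prediction error
    dp = (p - y)[:, None] / len(X)
    dW2 = h.T @ dp; db2 = dp.sum(0)
    dh = dp @ W2.T * (1 - h**2)
    dW1 = X.T @ dh; db1 = dh.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# "New data" the trained network has never seen before
new_frame = make_frame(speech_like=True)
prob = sigmoid(np.tanh(new_frame @ W1 + b1) @ W2 + b2)
print(f"Probability the new frame is speech: {float(prob):.2f}")
```

The real networks in modern hearing aids are vastly larger and are trained on millions of recorded sound scenes rather than 200 synthetic frames, but the principle is the same: learn from labelled examples first, then apply that learning instantly to sounds arriving at the microphones.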
The Oticon Intent range of hearing aids uses DNN technology that has been trained on 12 million real-life sound scenes. The DNN recognises all types of sound, their details, and how they should ideally sound, so the hearing aid can make sounds more distinct and work across an almost endless variety of environments.
Phonak’s Audéo Sphere Infinio uses a dedicated DNN chip to provide exceptional speech clarity from any direction, in any listening environment. By separating speech from background noise, it lets you engage with friends, family and colleagues in even the noisiest settings. It is the first hearing aid with two chips inside to handle this extra processing. We are excited to see what this hearing aid will do! It launches on 5 September 2024, and the reviews so far have been amazing. We will write again with a detailed update once we know more.