Historically, hearing aids have dealt with sound coming from multiple directions by filtering out certain sounds in favor of other, more important ones. Unfortunately, given how the brain processes sound, this approach can be problematic. A great deal of our dynamic experience of sound is tied to the context that comes with contrasting inputs, so simply eliminating sounds can disrupt that natural flow of sound processing and result in a far less detailed auditory image of the environment. This is one of the reasons the latest Oticon hearing aid represents such a fundamental leap forward. Rather than suppressing some sounds and amplifying others, it works with and facilitates the brain’s normal modes of processing, allowing it to follow several sound sources at the same time. This enhanced sound experience rests on two technologies developed by the Oticon team: OpenSound Navigator and Spatial Sound LX. The former scans the environment for sound inputs a staggering 100 times per second, then uses this store of real-time information to create a balanced auditory landscape.