Cognitive Control of a Hearing Aid Horizon 2020


The Cognitive Control of a Hearing Aid (COCOHA) Horizon 2020 project has the ambitious aim of creating a hearing-aid system that can be mentally (cognitively) steered by the user. Advanced signal processing techniques, such as beamforming and source separation, have the potential to isolate and enhance individual sound sources in a complex acoustic environment. However, selecting which source to enhance (i.e., steering the hearing aid) presents a major challenge. In this project, methods and algorithms will be developed to decode brain signals picked up by EEG electrodes and to extract attention and intention signals, matching them to acoustic sources in the environment. The hearing aid will then identify and enhance the sound source being attended to. The results of the project are expected to shed light on the mechanisms and limits of auditory attention, and to provide a first step towards wider application of advanced brain-computer interfaces (BCIs) in controlling prosthetic sensory systems.
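One common approach to matching EEG-derived attention signals to acoustic sources is stimulus reconstruction: an envelope of the attended speech is reconstructed from the EEG and compared against the envelopes of the candidate sources, with the best-matching source selected for enhancement. The sketch below illustrates only this final selection step with synthetic data; the function name, sampling rate, and all signals are illustrative assumptions, not part of the COCOHA project's actual pipeline.

```python
import numpy as np

def decode_attended_source(reconstructed, candidates):
    """Return the index of the candidate envelope most correlated
    (Pearson) with the envelope reconstructed from EEG."""
    scores = [np.corrcoef(reconstructed, c)[0, 1] for c in candidates]
    return int(np.argmax(scores)), scores

# Demo with synthetic data (all signals below are made up for illustration).
rng = np.random.default_rng(0)
n = 64 * 30  # a 30 s trial at an assumed 64 Hz envelope rate
env_a = rng.random(n)  # envelope of speaker A
env_b = rng.random(n)  # envelope of speaker B
# Simulate a noisy EEG-based reconstruction of the attended envelope (A).
reconstructed = env_a + 0.8 * rng.standard_normal(n)

idx, scores = decode_attended_source(reconstructed, [env_a, env_b])
print(idx)  # selects speaker A
```

In practice the reconstruction itself would come from a decoder (e.g., a linear backward model) trained on EEG data, and decisions would be smoothed over time before steering the beamformer.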

COCOHA is supporting the following projects:

Characterizing neural mechanisms of attention-driven speech processing
by PhD student Søren Fuglsang

Controlling a hearing aid by electrically assessed eye-gaze
by PhD student Antoine Favre-Félix

EEG measures of attention in normal-hearing and hearing-impaired listeners
by Research Assistant Jonatan Marcher-Rørsted

Attentional switching in a competitive speaker scenario
by Research Assistant Michael Noes Kiel Andersen

Auditory frequency tagging for an attention-controlled brain-computer interface
by Master's student Sandra Solli

Real-time attention control of auditory feedback
by Master's student Søren Vørnle Nielsen