Our project ECOMODE (Event-Driven Compressive Vision for Multimodal Interaction with Mobile Devices) has been selected for funding under the H2020 programme (call ICT-22-2014 on Multimodal and Natural computer interaction)!
ECOMODE aims to design and implement touchless mobile devices that are also usable by older adults and visually impaired people. By exploiting the recently matured, biologically inspired technique of event-driven compressive sensing (EDC) of audio-visual information, interaction with ECOMODE technology will be based on mid-air gestures and vision-assisted speech recognition.
The project will start in January 2015 and run for 4 years, until December 2018.
The consortium consists of: