Our project ECOMODE (Event-Driven Compressive Vision for Multimodal Interaction with Mobile Devices) was selected to be funded under the H2020 programme (ICT-22-2014 on Multimodal and Natural computer interaction)!
ECOMODE aims to design and implement touchless mobile devices that are also usable by older adults and visually impaired people. By exploiting the recently matured, biologically inspired technique of event-driven compressive sensing (EDC) of audio-visual information, interaction with ECOMODE technology will be based on mid-air gestures and vision-assisted speech recognition.
Within ECOMODE, our team will focus on understanding the specific characteristics and needs of older users with mild age-related visual and speech impairments, implementing a User-Centered Inclusive Design process. End-users will participate in all phases of the design process, including iterative loops of evaluation and validation of the ECOMODE technology.
The project will run for four years, from January 2015 to December 2018.
The consortium consists of: