Brain-Computer Interfaces as intelligent sensors for enhancing human-computer interaction
Abstract
The 14th ACM International Conference on Multimodal Interaction (ICMI) has launched its grand challenges. One of them is entitled “Brain-Computer Interfaces as intelligent sensors for enhancing human-computer interaction”. This challenge calls for papers, posters, demonstrations, and videos. It aims to explore the integration of traditional multimodal interaction with physiological computing (i.e., systems that use data from the human nervous system as control input to a technological system), in particular Brain-Computer Interfaces (BCI). We propose to shift the conceptual use of BCI from “BCI as an actor” (input control) to “BCI as an intelligent sensor” (monitor). This shift of emphasis promotes the capacity of BCI to capture spontaneous changes in the state of the user and to drive intelligent adaptation at the interface. BCIs can increasingly be used as intelligent sensors that “read” passive signals from the nervous system and infer user states in order to adapt Ambient Intelligence (AmI) environments and facilitate personalized and ubiquitous computing.
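To make the “intelligent sensor” role concrete, the following is a minimal, hypothetical sketch of a passive monitoring loop: sense a physiological feature, infer a coarse user state, and adapt the interface. The function names, the synthetic band-power features, and the theta/alpha threshold are illustrative assumptions only; they are not part of the challenge description or any specific BCI system.

```python
"""Minimal sketch of a passive BCI 'intelligent sensor' loop (hypothetical)."""
import random

def acquire_band_power_epoch():
    """Stand-in for EEG acquisition and feature extraction.

    Returns synthetic (theta, alpha) band power so the sketch runs without
    hardware; a real system would stream epochs from an amplifier.
    """
    theta = random.uniform(2.0, 8.0)   # 4-7 Hz power (arbitrary units)
    alpha = random.uniform(2.0, 8.0)   # 8-12 Hz power (arbitrary units)
    return theta, alpha

def infer_user_state(theta, alpha):
    """Toy inference step mapping features to a coarse user state.

    A rising theta/alpha ratio is commonly associated with higher mental
    workload; the 1.2 threshold here is purely illustrative.
    """
    return "high_workload" if theta / alpha > 1.2 else "low_workload"

def adapt_interface(state):
    """Placeholder for the adaptive interface / AmI environment."""
    if state == "high_workload":
        print("Adapting: simplify layout, defer notifications")
    else:
        print("Adapting: restore full interface")

if __name__ == "__main__":
    # The monitoring loop: sense -> infer -> adapt, with no explicit control
    # commands issued by the user (the "intelligent sensor" role).
    for _ in range(5):
        theta, alpha = acquire_band_power_epoch()
        adapt_interface(infer_user_state(theta, alpha))
```

In contrast to “BCI as an actor”, nothing in this loop requires the user to issue deliberate commands; the adaptation is driven entirely by passively monitored state estimates.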