
Unobtrusive emotion sensing and interpretation in smart environment


Currently, a particular focus of human-centered technology is expanding the traditional contextual sensing and smart processing capabilities of ubiquitous systems by exploiting users' affective and emotional states to enable more natural communication between computing artefacts and users. This paper presents a smart environment of Web services developed to integrate and manage existing and new emotion-sensing applications, which together provide tracking and recognition of human affective states in real time. In addition, two emotion interpreters based on the proposed 6-FACS and Distance models have been developed. Both models operate on encoded facial deformations described either in terms of Ekman's Action Units or the Facial Animation Parameters of the MPEG-4 standard. A fuzzy inference system, based on a reasoning model implemented in a knowledge base, is used for quantitative measurement and recognition of three-level intensities of basic and non-prototypical facial expressions. The designed frameworks, integrated into the smart environment, have been tested to evaluate the ability of the proposed models to extract and classify facial expressions, yielding an interpretation precision of 65–96% for basic emotions and 55–65% for non-prototypical emotions. The conducted tests confirm that both basic and non-prototypical expressions may be composed of other basic emotions, thereby establishing concordance between existing psychological models of emotion and Ekman's model traditionally used in affective computing applications.
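To make the fuzzy-inference idea concrete, the following is a minimal illustrative sketch, not the paper's actual knowledge base: Action Unit activations are fuzzified into the three intensity terms via triangular membership functions, and a Mamdani-style rule (min for AND) grades an expression's intensity. The membership breakpoints and the happiness rule over AU6/AU12 (cheek raiser, lip corner puller) are assumptions chosen for illustration.

```python
# Hypothetical sketch of three-level fuzzy intensity recognition over
# Action Unit activations. Breakpoints and the rule base are illustrative,
# not taken from the paper's knowledge base.

def tri(x, a, b, c):
    """Triangular membership function with peak at b; a==b or b==c
    degenerate into left/right shoulder functions."""
    if x < a or x > c:
        return 0.0
    if x == b:
        return 1.0
    if x < b:
        return (x - a) / (b - a) if b > a else 1.0
    return (c - x) / (c - b) if c > b else 1.0

# Three-level intensity terms over a normalized AU activation in [0, 1]
# (hypothetical breakpoints).
INTENSITY_TERMS = {
    "low":    (0.0, 0.0, 0.5),
    "medium": (0.2, 0.5, 0.8),
    "high":   (0.5, 1.0, 1.0),
}

def fuzzify(activation):
    """Degrees of membership of one AU activation in each intensity term."""
    return {term: tri(activation, a, b, c)
            for term, (a, b, c) in INTENSITY_TERMS.items()}

def classify_happiness(aus):
    """Grade happiness intensity from AU6 and AU12 (Ekman's AUs for
    happiness). Mamdani AND: rule strength = min over antecedents."""
    degrees = {
        level: min(fuzzify(aus[au])[level] for au in ("AU6", "AU12"))
        for level in INTENSITY_TERMS
    }
    return max(degrees, key=degrees.get), degrees
```

For example, strong activations of both AUs (`{"AU6": 0.9, "AU12": 0.85}`) fire the "high" intensity term most strongly; weakening either AU lowers the rule strength, which is how such a scheme yields a graded, quantitative measure of expression intensity rather than a hard label.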