Issue title: Communicative social signals: Computational and behavioural aspects of human-human and human-machine interaction
Guest editors: Klára Vicsi [x] and Anna Esposito [y]
Article type: Research Article
Authors: Navarretta, Costanza
Affiliations: Centre for Language Technology, University of Copenhagen, Njalsgade 140, build. 25 4th floor, 2300 Copenhagen S, Denmark. E-mail: [email protected] | [x] Laboratory of Speech Acoustics, Department of Telecommunications and Media-Informatics, Budapest University of Technology and Economics, Budapest, Hungary | [y] Dipartimento di Psicologia and IIASS, Seconda Università di Napoli, Vietri Sul Mare, Salerno, Italy
Abstract: This paper deals with multimodal behaviours in an annotated corpus of first encounters. More specifically, it presents a study aimed at determining common relations between facial expressions connected to emotions and co-occurring speech. Emotions are described through emotion labels and bipolar values in three emotional dimensions, Pleasure, Arousal and Dominance, while the transcriptions of speech comprise words, quasi-words, pauses, and filled pauses. The study establishes a strong relation between specific communicative aspects, facial expressions and co-occurring speech. We found that some emotion-denoting facial expressions always co-occur with speech, and are often related to specific speech tokens, while others often occur unimodally, that is, without co-occurring speech. We also observed large individual differences in the number of facial expressions produced, but we did not find a correlation between the number of speech tokens and the number of facial expressions produced by the participants in the first encounters. Our study also confirms preceding research [31] suggesting that the most common emotions in first encounters partly depend on the specific social activity. These findings are important for understanding human behaviour in face-to-face communication, and they also contribute to the construction and evaluation of corpus-based models of plausible affective CogInfoCom systems.
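The abstract describes each annotated emotion as a label plus bipolar values on the Pleasure, Arousal and Dominance dimensions, aligned with co-occurring speech tokens. A minimal Python sketch of how such a record might be represented is given below; the field names and example values are illustrative assumptions, not the corpus's actual annotation schema.

    # Hypothetical sketch of a PAD-annotated facial expression record
    # with co-occurring speech tokens (names and values are assumptions).
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class FacialExpression:
        label: str                  # emotion label, e.g. "amused"
        pleasure: int               # bipolar value: -1 (negative) or +1 (positive)
        arousal: int                # bipolar value: -1 (low) or +1 (high)
        dominance: int              # bipolar value: -1 (submissive) or +1 (dominant)
        speech_tokens: List[str] = field(default_factory=list)  # aligned tokens

        @property
        def unimodal(self) -> bool:
            # True when the expression occurs without co-occurring speech
            return not self.speech_tokens

    # Example: a smile annotated as amused, overlapping two speech tokens
    smile = FacialExpression("amused", pleasure=1, arousal=1, dominance=-1,
                             speech_tokens=["mm", "yeah"])
    print(smile.unimodal)  # False

A structure like this makes the paper's measurements straightforward to express: unimodal expressions are those with an empty token list, and relations between emotions and specific speech tokens can be read off the aligned tokens.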
Keywords: Speech, emotions, facial expression, annotated multimodal data, communicative functions
DOI: 10.3233/IDT-140194
Journal: Intelligent Decision Technologies, vol. 8, no. 4, pp. 255-263, 2014