Issue title: Special section: Selected papers of LKE 2019
Guest editors: David Pinto, Vivek Singh and Fernando Perez
Article type: Research Article
Authors: Céspedes-Hernández, David[a],* | González-Calleros, Juan Manuel[a] | Guerrero-García, Josefina[a] | Vanderdonckt, Jean[b]
Affiliations: [a] Facultad de Ciencias de la Computación, Benemérita Universidad Autónoma de Puebla, Mexico | [b] Université catholique de Louvain (UCLouvain), LouRIM, Place des Doyens 1, B-1348 Louvain-la-Neuve, Belgium
Correspondence: [*] Corresponding author. David Céspedes-Hernández, Facultad de Ciencias de la Computación, Benemérita Universidad Autónoma de Puebla, Mexico. Tel. +52 12 345 678; Fax. +52 13 678 789; E-mail: [email protected].
Abstract: A gesture elicitation study is a popular method for inviting a sample of end users to propose gestures for executing functions in a certain context of use, specified by its users and their functions, the device or platform used, and the physical environment in which they work. Gestures proposed in such a study need to be classified and, perhaps, extended in order to feed a gesture recognizer. To support this process, we conducted a full-body gesture elicitation study in which domestic end users, standing in front of a camera, proposed gestures for executing functions in a smart home environment. Instead of defining these functions opportunistically, we defined them based on a taxonomy of abstract tasks. From the elicited gestures, an XML-compliant grammar for specifying the resulting gestures is defined, created, and implemented to graphically represent, label, characterize, and formally present such full-body gestures. This formal notation is also useful for generating variations of the elicited gestures on-the-fly, thereby enabling one-shot learning.
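The paper's XML grammar is not reproduced on this page, but the following minimal Python sketch illustrates the idea described in the abstract: a full-body gesture specified as XML postures of named joints, plus a naive variation generator that perturbs joint coordinates to synthesize extra training samples for one-shot learning. The element names (gesture, posture, joint) and the jitter-based perturbation are hypothetical assumptions for illustration only, not the authors' actual grammar or generation method.

```python
# Hypothetical sketch, NOT the authors' actual grammar: a minimal
# XML-compliant full-body gesture specification and a naive variation
# generator that perturbs elicited gestures on-the-fly to produce
# synthetic samples for one-shot learning.
import random
import xml.etree.ElementTree as ET

GESTURE_XML = """
<gesture name="raiseRightArm" function="turnLightOn">
  <posture t="0">
    <joint name="rightWrist" x="0.40" y="0.90" z="0.10"/>
    <joint name="rightElbow" x="0.35" y="1.10" z="0.05"/>
  </posture>
  <posture t="1">
    <joint name="rightWrist" x="0.40" y="1.60" z="0.10"/>
    <joint name="rightElbow" x="0.38" y="1.35" z="0.05"/>
  </posture>
</gesture>
"""

def generate_variation(xml_text: str, jitter: float = 0.05) -> str:
    """Return a copy of the gesture with each joint coordinate perturbed
    by uniform noise, simulating inter-user execution variability."""
    root = ET.fromstring(xml_text)
    for joint in root.iter("joint"):
        for axis in ("x", "y", "z"):
            value = float(joint.get(axis))
            joint.set(axis, f"{value + random.uniform(-jitter, jitter):.3f}")
    return ET.tostring(root, encoding="unicode")

if __name__ == "__main__":
    # Generate a few synthetic variations of the single elicited sample.
    for _ in range(3):
        print(generate_variation(GESTURE_XML))
```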
Keywords: Gesture elicitation study, gesture grammar, gesture recognition, gesture user interfaces, engineering interactive computing systems, one-shot learning
DOI: 10.3233/JIFS-179903
Journal: Journal of Intelligent & Fuzzy Systems, vol. 39, no. 2, pp. 2433-2444, 2020