Searching for just a few words should be enough to get started. If you need to make more complex queries, use the tips below to guide you.
Article type: Research Article
Authors: Sharma, Vinay Krishna [a] | Murthy, L.R.D. [a] | Singh Saluja, KamalPreet [a] | Mollyn, Vimal [b] | Sharma, Gourav [c] | Biswas, Pradipta [a], [*]
Affiliations: [a] Indian Institute of Science, Bangalore, India | [b] Indian Institute of Technology, Madras, India | [c] Indian Institute of Information Technology, Kalyani, India
Correspondence: [*] Corresponding author: Pradipta Biswas, Indian Institute of Science, Bangalore, India. E-mail: [email protected].
Abstract: BACKGROUND: People with severe speech and motor impairment (SSMI) often use a technique called eye pointing to communicate with the outside world. A parent, caretaker or teacher holds a printed board in front of them and interprets their intentions by manually analyzing their eye gaze. This technique is error prone, time consuming and dependent on a single caretaker. OBJECTIVE: We aimed to automate the eye tracking process electronically using a commercially available tablet, computer or laptop, without requiring any dedicated hardware for eye gaze tracking. The eye gaze tracker is used to develop a video see-through based augmented reality (AR) display that controls a robotic device with eye gaze, deployed for a fabric printing task. METHODS: We undertook a user-centred design process and separately evaluated the webcam-based gaze tracker and the video see-through based human-robot interaction involving users with SSMI. We also report a user study on manipulating a robotic arm with the webcam-based eye gaze tracker. RESULTS: Using our bespoke eye gaze controlled interface, able-bodied users could select one of nine screen regions in a median of less than 2 s, and users with SSMI could do so in a median of 4 s. Using the eye gaze controlled human-robot AR display, users with SSMI could undertake a representative pick-and-drop task in an average of less than 15 s and reach a randomly designated target within 60 s using a COTS eye tracker, and in an average of 2 min using the webcam-based eye gaze tracker. CONCLUSION: The proposed system allows users with SSMI to manipulate physical objects without any dedicated eye gaze tracker. The novelty of the system lies in its non-invasiveness: earlier work mostly used glasses-based wearable trackers or head/face tracking, and no earlier work reported the use of webcam-based eye tracking for controlling a robotic arm by users with SSMI.
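The abstract describes selecting one of nine screen regions with gaze. A minimal sketch of that interaction pattern is shown below, assuming an upstream webcam gaze tracker supplies gaze points normalized to [0, 1]; the region layout, dwell threshold, and all names here are illustrative, not the authors' implementation.

```python
DWELL_THRESHOLD_S = 1.5  # hypothetical dwell time before a region counts as selected


def gaze_to_region(x: float, y: float) -> int:
    """Map a normalized gaze point to one of nine regions (0-8),
    laid out as a 3x3 grid, row-major from the top-left."""
    col = min(int(x * 3), 2)  # clamp so x == 1.0 stays in the last column
    row = min(int(y * 3), 2)
    return row * 3 + col


class DwellSelector:
    """Emit a selection once gaze has stayed in one region long enough."""

    def __init__(self, threshold_s: float = DWELL_THRESHOLD_S):
        self.threshold_s = threshold_s
        self.current = None      # region the gaze is currently in
        self.entered_at = None   # timestamp when that region was entered

    def update(self, region: int, t: float):
        """Feed one gaze sample; return the region if dwell completed, else None."""
        if region != self.current:
            self.current, self.entered_at = region, t
            return None
        if t - self.entered_at >= self.threshold_s:
            self.entered_at = t  # reset so the selection does not re-fire every frame
            return region
        return None
```

Dwell-based selection is a common alternative to blink- or switch-based confirmation in gaze interfaces, trading speed against accidental "Midas touch" selections.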
Keywords: Eye gaze tracking, assistive technology, human robot interaction, SSMI
DOI: 10.3233/TAD-200264
Journal: Technology and Disability, vol. 32, no. 3, pp. 179-197, 2020
IOS Press, Inc.
6751 Tepper Drive
Clifton, VA 20124
USA
Tel: +1 703 830 6300
Fax: +1 703 830 2300
[email protected]
For editorial issues, like the status of your submitted paper or proposals, write to [email protected]
IOS Press
Nieuwe Hemweg 6B
1013 BG Amsterdam
The Netherlands
Tel: +31 20 688 3355
Fax: +31 20 687 0091
[email protected]
For editorial issues, permissions, book requests, submissions and proceedings, contact the Amsterdam office [email protected]
Inspirees International (China Office)
Ciyunsi Beili 207(CapitaLand), Bld 1, 7-901
100025, Beijing
China
Free service line: 400 661 8717
Fax: +86 10 8446 7947
[email protected]
For editorial issues, like the status of your submitted paper or proposals, write to [email protected]