Performance Analysis of a Head and Eye Motion-Based Control Interface for Assistive Robots

Assistive robots support people with limited mobility in their everyday activities and work. However, most assistive systems and technologies for supporting eating and drinking require residual mobility in the arms or hands. For people without residual mobility, various hands-free controls have been developed, and combining different modalities can markedly improve control. The novelty of this work is a new concept for controlling a robot through a combination of head and eye motions. The control unit is a mobile, compact, and low-cost multimodal sensor system: a Magnetic, Angular Rate, and Gravity (MARG) sensor detects head motion, and an eye tracker captures the user's gaze. To analyze the performance of the two modalities, an experimental evaluation with ten able-bodied subjects and one subject with tetraplegia was performed. Discrete (event-based) control was assessed with a button activation task, and two-dimensional continuous cursor control with a Fitts's Law task. The usability study was based on a use-case scenario in which a collaborative robot assists with a drinking action. For the able-bodied subjects, the results show no significant difference between eye motions and head motions in button activation time or throughput, although the error rate in the Fitts's Law task was significantly higher with the eye tracker. Due to the limited head motion of the subject with tetraplegia, button activation with the eye tracker was slightly faster than with the MARG sensor. In the use case, all subjects were able to use the control unit successfully to support the drinking action. A further study with more subjects with tetraplegia is planned to verify these results.
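The Fitts's Law task mentioned in the abstract is conventionally scored with the Shannon formulation of the index of difficulty and a throughput measure. As a minimal sketch of these standard formulas (the distance, width, and timing values below are hypothetical illustrations, not data from the study):

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts's index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time):
    """Throughput in bits per second: index of difficulty over movement time."""
    return index_of_difficulty(distance, width) / movement_time

# Hypothetical trial: 300 px to the target, 100 px target width, 1.0 s movement.
id_bits = index_of_difficulty(300, 100)   # log2(300/100 + 1) = log2(4) = 2.0 bits
tp = throughput(300, 100, 1.0)            # 2.0 bits per second
```

Comparing throughput across input modalities (here, head motion vs. gaze) is the usual way such cursor-control tasks are evaluated.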


Bibliographic Details
Main Authors: Sarah Stalljann, Lukas Wöhle, Jeroen Schäfer, Marion Gebhard
Format: Article
Language: English
Published: MDPI AG, 2020-12-01
Series: Sensors
Subjects: assistive technology; motion sensors; eye tracker; MARG; tetraplegia; Fitts' Law
Online Access: https://www.mdpi.com/1424-8220/20/24/7162
Affiliation (all authors): Group of Sensors and Actuators, Department of Electrical Engineering and Applied Physics, Westphalian University of Applied Sciences, 45877 Gelsenkirchen, Germany
DOI: 10.3390/s20247162
ISSN: 1424-8220