Elderly Care Based on Hand Gestures Using Kinect Sensor
Technological advances have allowed hand gestures to become an important research field, especially in applications such as health care and assistive applications for elderly people, providing natural interaction with the assisting system through a camera by making specific gestures. In this study,...
Main Authors: | Munir Oudah, Ali Al-Naji, Javaan Chahl |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2021-12-01 |
Series: | Computers |
Subjects: | elderly care; hand gesture; embedded system; Kinect V2 depth sensor; simple convolutional neural network (SCNN); depth sensor |
Online Access: | https://www.mdpi.com/2073-431X/10/1/5 |
id |
doaj-2c5e7de899af4bcf994d03c1f6c41dab |
---|---|
record_format |
Article |
spelling |
doaj-2c5e7de899af4bcf994d03c1f6c41dab 2020-12-27T00:01:25Z eng MDPI AG. Computers, 2073-431X, 2021-12-01, vol. 10, no. 1, article 5. doi:10.3390/computers10010005. Elderly Care Based on Hand Gestures Using Kinect Sensor. Munir Oudah (Electrical Engineering Technical College, Middle Technical University, Baghdad 10022, Iraq); Ali Al-Naji (Electrical Engineering Technical College, Middle Technical University, Baghdad 10022, Iraq); Javaan Chahl (School of Engineering, University of South Australia, Mawson Lakes, SA 5095, Australia). Technological advances have allowed hand gestures to become an important research field, especially in applications such as health care and assistive applications for elderly people, providing natural interaction with the assisting system through a camera by making specific gestures. In this study, we proposed three different scenarios using a Microsoft Kinect V2 depth sensor and then evaluated the effectiveness of the outcomes. The first scenario used joint tracking combined with a depth threshold to enhance hand segmentation and efficiently recognise the number of extended fingers. The second scenario utilised the metadata parameters provided by the Kinect V2 depth sensor, which supplied 11 parameters related to the tracked body and gave information about three gestures for each hand. The third scenario used a simple convolutional neural network with joint tracking by depth metadata to recognise and classify five hand gesture categories. In this study, deaf-mute elderly people performed five different hand gestures, each related to a specific request, such as needing water, a meal, the toilet, help, or medicine. The request was then sent via the Global System for Mobile Communications (GSM) as a text message to the care provider’s smartphone, because the elderly subjects could not execute any activity independently. https://www.mdpi.com/2073-431X/10/1/5. Keywords: elderly care; hand gesture; embedded system; Kinect V2 depth sensor; simple convolutional neural network (SCNN); depth sensor |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Munir Oudah; Ali Al-Naji; Javaan Chahl |
spellingShingle |
Munir Oudah; Ali Al-Naji; Javaan Chahl; Elderly Care Based on Hand Gestures Using Kinect Sensor; Computers; elderly care; hand gesture; embedded system; Kinect V2 depth sensor; simple convolutional neural network (SCNN); depth sensor |
author_facet |
Munir Oudah; Ali Al-Naji; Javaan Chahl |
author_sort |
Munir Oudah |
title |
Elderly Care Based on Hand Gestures Using Kinect Sensor |
title_short |
Elderly Care Based on Hand Gestures Using Kinect Sensor |
title_full |
Elderly Care Based on Hand Gestures Using Kinect Sensor |
title_fullStr |
Elderly Care Based on Hand Gestures Using Kinect Sensor |
title_full_unstemmed |
Elderly Care Based on Hand Gestures Using Kinect Sensor |
title_sort |
elderly care based on hand gestures using kinect sensor |
publisher |
MDPI AG |
series |
Computers |
issn |
2073-431X |
publishDate |
2021-12-01 |
description |
Technological advances have allowed hand gestures to become an important research field, especially in applications such as health care and assistive applications for elderly people, providing natural interaction with the assisting system through a camera by making specific gestures. In this study, we proposed three different scenarios using a Microsoft Kinect V2 depth sensor and then evaluated the effectiveness of the outcomes. The first scenario used joint tracking combined with a depth threshold to enhance hand segmentation and efficiently recognise the number of extended fingers. The second scenario utilised the metadata parameters provided by the Kinect V2 depth sensor, which supplied 11 parameters related to the tracked body and gave information about three gestures for each hand. The third scenario used a simple convolutional neural network with joint tracking by depth metadata to recognise and classify five hand gesture categories. In this study, deaf-mute elderly people performed five different hand gestures, each related to a specific request, such as needing water, a meal, the toilet, help, or medicine. The request was then sent via the Global System for Mobile Communications (GSM) as a text message to the care provider’s smartphone, because the elderly subjects could not execute any activity independently. |
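The first scenario described above (joint tracking plus a depth threshold to segment the hand and count extended fingers) can be illustrated with a minimal sketch. This is not the authors' implementation: real Kinect V2 access (e.g. via an SDK binding) and the paper's exact segmentation pipeline are not shown; a synthetic depth frame in millimetres stands in for sensor data, and the scanline-based finger count, the `band_mm` tolerance, and the function names are illustrative assumptions.

```python
import numpy as np

def segment_hand(depth_mm, hand_xy, band_mm=100):
    """Binary mask of pixels whose depth lies within +/- band_mm of the
    depth at the tracked hand-joint pixel (the depth-threshold step)."""
    hx, hy = hand_xy
    ref = int(depth_mm[hy, hx])
    return np.abs(depth_mm.astype(np.int32) - ref) <= band_mm

def count_fingers(mask, row):
    """Count foreground runs along one scanline above the palm; each
    contiguous run is taken as one extended finger (a toy heuristic)."""
    line = mask[row].astype(np.int8)
    # a rising edge 0 -> 1 marks the start of each finger-width run
    return int(np.sum((line[1:] == 1) & (line[:-1] == 0)) + (line[0] == 1))

# Synthetic 424x512 depth frame: background at 2000 mm,
# three vertical "finger" strips and a palm block at 800 mm.
depth = np.full((424, 512), 2000, dtype=np.uint16)
for cx in (200, 230, 260):
    depth[100:200, cx:cx + 10] = 800   # finger strips
depth[200:260, 190:280] = 800          # palm region below the strips

mask = segment_hand(depth, hand_xy=(235, 230), band_mm=100)
print(count_fingers(mask, row=150))    # -> 3
```

A production version would track the hand joint frame-to-frame and use contour analysis (e.g. convexity defects) rather than a single scanline, but the depth-band segmentation step is the same idea.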
topic |
elderly care; hand gesture; embedded system; Kinect V2 depth sensor; simple convolutional neural network (SCNN); depth sensor |
url |
https://www.mdpi.com/2073-431X/10/1/5 |
work_keys_str_mv |
AT muniroudah elderlycarebasedonhandgesturesusingkinectsensor AT alialnaji elderlycarebasedonhandgesturesusingkinectsensor AT javaanchahl elderlycarebasedonhandgesturesusingkinectsensor |
_version_ |
1724370091046338560 |