Detecting engagement levels for autism intervention therapy using RGB-D camera
Our motivation for this work is to develop an autonomous robot system that can perform autism intervention therapy. Autism spectrum disorder (ASD) is a common neurodevelopmental disorder that affects millions of people in the United States alone. The best way to treat ASD and help...
Main Author: | Ge, Bi |
---|---|
Other Authors: | Howard, Ayanna |
Format: | Thesis (application/pdf) |
Published: | Georgia Institute of Technology, May 2016 |
Subjects: | Robotics; Autism; Machine learning |
Online Access: | http://hdl.handle.net/1853/55043 |
id |
ndltd-GATECH-oai-smartech.gatech.edu-1853-55043 |
---|---|
record_format |
oai_dc |
date |
May 2016 (deposited 2016-05-27) |
type |
Thesis (application/pdf) |
collection |
NDLTD |
format |
Others |
sources |
NDLTD |
topic |
Robotics; Autism; Machine learning |
description |
Our motivation for this work is to develop an autonomous robot system that can perform autism intervention therapy. Autism spectrum disorder (ASD) is a common neurodevelopmental disorder that affects millions of people in the United States alone. The best way to treat ASD and help people with ASD learn new skills is through applied behavior analysis (ABA), i.e., autism intervention therapy. Because people with ASD feel less stressed in a predictable, simple environment than when interacting with other people, and because intervention therapy provided by professional therapists is generally expensive and inaccessible, it would be beneficial to build robots that can perform intervention therapy with children without a therapist or instructor present. In this research, we focus on detecting the engagement/disengagement levels of a child in a therapy session as a first step toward designing a therapy robot. We primarily use an RGB-D camera, the Microsoft Kinect 2.0, to extract kinematic joint data from the therapy session. We also set up a child study with the Kid’s Creek therapy center to recruit children with ASD and record their interactions with a therapist while working on a touch-screen game on a tablet. After carefully selecting features derived from skeletal movements and poses, we show that our system achieves 97% accuracy in detecting engagement and disengagement under cross-validation. |
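The abstract does not specify the classifier or the exact pose features used. As a rough illustration only, the sketch below (assuming scikit-learn and synthetic stand-in data, since the real Kinect recordings are not available here) shows the general shape of such a pipeline: windows of skeleton joint positions are reduced to pose/movement features and a classifier is scored with cross-validation. The feature choices, the random-forest model, and the random data are all hypothetical, not the thesis's method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical stand-in for Kinect 2.0 skeleton data:
# 200 labeled windows, each 30 frames of 25 joints x 3 coordinates.
windows = rng.normal(size=(200, 30, 25 * 3))
labels = rng.integers(0, 2, size=200)   # 1 = engaged, 0 = disengaged

def pose_features(window):
    """Illustrative features from one window: mean joint positions
    (pose) and per-joint variability over time (movement)."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

X = np.stack([pose_features(w) for w in windows])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, labels, cv=5)   # 5-fold cross-validation
print(f"mean cross-validation accuracy: {scores.mean():.2f}")
```

On the random data above the score hovers near chance; the point is only the structure: windowed joint data, derived features, and a cross-validated accuracy estimate like the 97% the thesis reports.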
author2 |
Howard, Ayanna |
author |
Ge, Bi |
title |
Detecting engagement levels for autism intervention therapy using RGB-D camera |
publisher |
Georgia Institute of Technology |
publishDate |
2016 |
url |
http://hdl.handle.net/1853/55043 |