A Kinect-based Interactive Laboratory
Main Authors: | Chung-Min Cha 查忠敏 |
---|---|
Other Authors: | Mu-Chun Su 蘇木春 |
Format: | Others |
Language: | zh-TW |
Published: | 2013 |
Online Access: | http://ndltd.ncl.edu.tw/handle/32983278641795053357 |
id |
ndltd-TW-101NCU05392105 |
---|---|
record_format |
oai_dc |
spelling |
ndltd-TW-101NCU053921052015-10-13T22:34:50Z http://ndltd.ncl.edu.tw/handle/32983278641795053357 A Kinect-based Interactive Laboratory 基於Kinect之互動實驗室 Chung-Min Cha 查忠敏 Master's thesis === National Central University === Department of Computer Science and Information Engineering === Academic year 101 === (abstract as given in the description field below) Mu-Chun Su 蘇木春 2013 degree thesis ; thesis 105 zh-TW |
collection |
NDLTD |
language |
zh-TW |
format |
Others |
sources |
NDLTD |
description |
Master's thesis === National Central University === Department of Computer Science and Information Engineering === Academic year 101 === This thesis presents a gesture-based interactive laboratory simulation system. The proposed system consists of two main modules: a “gesture recognition module” and an “interactive laboratory simulation module”. The system provides users with a simulation environment in which experiments can be conducted through gestures that preserve the feel of the real operations. The current “Kinect-based interactive laboratory” is developed to simulate chemical experiments designed for senior high-school students. Ten basic gestures, derived from many elementary chemical experiments, constitute the main action units of the simulation system. These ten gestures are categorized into two action units according to the spatial range of their movements: the large-scale gesture action unit and the small-scale gesture action unit. The large-scale unit covers gestures that involve a wide range of hand movement while operating experiment equipment, whereas the small-scale unit covers gestures whose motion is confined to the palm region.
In this thesis, a Kinect sensor is used as the motion-capture device, providing gesture information to the "gesture recognition module". This module adopts a two-stage approach to gesture recognition: it first classifies large-scale gestures from the skeleton information, and then extracts the salient gesture features proposed in this thesis from the hand movements and feeds them to a radial basis function network (RBFN), which serves as the recognition unit for the small-scale gestures (a minimal sketch of such a two-stage recognizer is given below). The "interactive laboratory simulation module" then provides a simple interactive interface that combines the recognition results from the "gesture recognition module" to complete the simulation of chemical experiments. In addition, an authoring tool is provided so that teachers can easily extend and modify the system when preparing their own future experiments.
Several kinds of experiments were designed to verify the performance of the system, first examining how well the system generalizes to non-specific users and then how robust it is to changes in the operating environment. In the generalization experiments, the users were divided into a specific-user group of 3 subjects and a non-specific-user group of 8 subjects. Data collected from the specific-user group were used to train the RBF networks, and data collected from the non-specific-user group were then used to test the generalization performance of the trained networks (see the evaluation sketch below). To verify robustness to environmental changes, further experiments explored whether different standing locations and viewing angles of the user change the recognition rates. The results showed that, under the same environment, the recognition rate reached at least 97% for the specific-user group and 96% for the non-specific-user group. Under different environments, the recognition rate for specific users varied from 90% to 96%, while for non-specific users it varied from 85% to 91% as the locations or viewing angles changed. The influence of changes in the operating environment was therefore more apparent than that of changes in the users.
|
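The description names the technique (a two-stage recognizer with an RBFN for the small-scale gestures) but gives no implementation details. The following is a minimal illustrative sketch, not the thesis's code: it assumes per-frame feature vectors have already been extracted from the Kinect skeleton stream, and the feature layout, the RBF centers, the movement-extent threshold, and the rule-based large-scale classifier are all placeholders.

```python
# Minimal sketch of a two-stage gesture recognizer with a Gaussian RBF network.
# NOT the thesis implementation: feature extraction, RBF centers, and the
# movement-extent threshold below are illustrative assumptions.
import numpy as np


class RBFNetwork:
    """Radial basis function network: Gaussian hidden units, linear output layer."""

    def __init__(self, centers: np.ndarray, sigma: float, n_classes: int):
        self.centers = centers                    # (n_hidden, n_features) prototypes
        self.sigma = sigma                        # shared Gaussian width
        self.weights = np.zeros((centers.shape[0], n_classes))

    def _hidden(self, X: np.ndarray) -> np.ndarray:
        # Gaussian activation of every hidden unit for every sample.
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * self.sigma ** 2))

    def fit(self, X: np.ndarray, y: np.ndarray) -> None:
        # Solve the output weights by least squares against one-hot targets.
        H = self._hidden(X)                       # (n_samples, n_hidden)
        T = np.eye(self.weights.shape[1])[y]      # (n_samples, n_classes)
        self.weights, *_ = np.linalg.lstsq(H, T, rcond=None)

    def predict(self, X: np.ndarray) -> np.ndarray:
        return self._hidden(X) @ self.weights     # class scores; argmax gives the label


def recognize_gesture(features: np.ndarray, movement_extent: float,
                      rbfn: RBFNetwork, classify_large_scale,
                      extent_threshold: float = 0.3) -> int:
    """Stage 1: wide hand trajectories go to a skeleton-rule classifier.
    Stage 2: palm-confined gestures go to the trained RBF network."""
    if movement_extent > extent_threshold:
        return classify_large_scale(features)
    scores = rbfn.predict(features[None, :])
    return int(scores.argmax())
```

In practice the RBF centers are usually taken from the training samples (e.g., via k-means) and the width tuned on validation data; the abstract does not say how the thesis sets them.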
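The user-generalization protocol in the description (train on the 3-subject specific-user group, test on the 8-subject non-specific-user group) can likewise be sketched. The data layout and helper names here are hypothetical.

```python
# Hypothetical sketch of the specific vs. non-specific user evaluation split.
import numpy as np


def evaluate_generalization(data_by_user: dict, specific_ids, nonspecific_ids,
                            build_classifier) -> float:
    """Train on the specific-user group only; return accuracy on the non-specific group."""

    def stack(user_ids):
        X = np.vstack([data_by_user[u]["features"] for u in user_ids])
        y = np.concatenate([data_by_user[u]["labels"] for u in user_ids])
        return X, y

    X_train, y_train = stack(specific_ids)        # e.g. the 3 specific users
    clf = build_classifier(X_train, y_train)      # e.g. returns a fitted RBFNetwork

    X_test, y_test = stack(nonspecific_ids)       # e.g. the 8 non-specific users
    predictions = clf.predict(X_test).argmax(axis=1)
    return float((predictions == y_test).mean())  # recognition rate on unseen users
```

The recognition rates the abstract reports (for example, 96% for non-specific users under the same environment) correspond to this kind of held-out-user accuracy.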
author2 |
Mu-Chun Su |
author_facet |
Mu-Chun Su Chung-Min Cha 查忠敏 |
author |
Chung-Min Cha 查忠敏 |
author_sort |
Chung-Min Cha |
title |
A Kinect-based Interactive Laboratory |
title_sort |
kinect-based interactive laboratory |
publishDate |
2013 |
url |
http://ndltd.ncl.edu.tw/handle/32983278641795053357 |