Study of Applying Eigenhands and Neural Network Classifiers to Gesture Recognition
Master's === National Sun Yat-sen University === Institute of Undersea Technology === 87 === Input devices play a key role in man-machine interfaces. It usually takes operators some practice to get used to an input device before they can command the system at will. For more complex systems, e.g. an airplane, this training period can be significant...
Main Authors: | Wang, Tian-Peir 王天培 |
---|---|
Other Authors: | Wang, Chau-Chang 王兆璋 |
Format: | Others |
Language: | zh-TW |
Published: | 1999 |
Online Access: | http://ndltd.ncl.edu.tw/handle/93992799144842651630 |
id | ndltd-TW-087NSYSU637006
record_format | oai_dc
spelling | ndltd-TW-087NSYSU6370062016-07-11T04:13:19Z http://ndltd.ncl.edu.tw/handle/93992799144842651630 Study of Applying Eigenhands and Neural Network Classifiers to Gesture Recognition 特徵手與類神經網路分類器於手勢辨識之研究 Wang, Tian-Peir 王天培 Master's === National Sun Yat-sen University === Institute of Undersea Technology === 87 Wang, Chau-Chang 王兆璋 1999 學位論文 ; thesis 61 zh-TW |
collection | NDLTD
language | zh-TW
format | Others
sources | NDLTD
description |
Master's === National Sun Yat-sen University === Institute of Undersea Technology === 87 === Input devices play a key role in man-machine interfaces. It usually takes operators some practice to get used to an input device before they can command the system at will. For more complex systems, e.g. an airplane, this training period can be significant. In other words, operators are trained to accommodate the input devices. One alternative is to train the machine to understand the operator's intention; speech recognition is a typical example of this idea. In this thesis we study the feasibility of gesture recognition as an input device for commanding ROVs (Remotely Operated Vehicles) in the future.
In order to have a more general case, instead of creating our own gesture commands, we adopt sixteen different characters from Chinese Sign Language as command candidates. First, a homemade trigger starts the image-grabbing program running on a PC. Once a complete gesture is recorded (ranging from 20 frames for static gestures to 50 frames for dynamic gestures), the overlapped image is used to represent the overall motion of the hand. All sixteen images are flattened into column vectors for eigenvalue/eigenvector analysis of their covariance matrix. The top seven eigenvalues and their corresponding eigenvectors are picked to span a subspace. These eigenimages are called "Eigenhands" and are used to carry out Principal Component Analysis (the Karhunen-Loeve expansion). An unknown gesture is projected onto the subspace, and its coordinates in the subspace are the features used for subsequent recognition.
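The Eigenhands step described above can be sketched as follows. This is a minimal illustration, not the thesis's code: function names, array shapes, and the small-sample Gram-matrix trick (eigen-decomposing the n-by-n matrix of image inner products instead of the huge pixel-by-pixel covariance matrix) are implementation assumptions.

```python
import numpy as np

def eigenhands(images, k=7):
    """Compute the top-k 'Eigenhands' from flattened gesture images.

    images: (n_samples, n_pixels) array, one overlapped image per gesture.
    Returns (mean, components), where components has shape (k, n_pixels).
    """
    X = np.asarray(images, dtype=float)
    mean = X.mean(axis=0)
    centered = X - mean
    # Small-sample trick: eigen-decompose the (n x n) Gram matrix
    # rather than the (n_pixels x n_pixels) covariance matrix.
    gram = centered @ centered.T
    vals, vecs = np.linalg.eigh(gram)        # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]       # indices of the k largest
    comps = (centered.T @ vecs[:, order]).T  # back-project to pixel space
    comps /= np.linalg.norm(comps, axis=1, keepdims=True)
    return mean, comps

def project(image, mean, comps):
    """Coordinates of a gesture image in the Eigenhand subspace:
    these are the features used for recognition."""
    return comps @ (np.asarray(image, dtype=float) - mean)
```

With sixteen training images, `project` reduces each unknown gesture to a seven-dimensional feature vector, matching the subspace dimension chosen in the abstract.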
We design a "Dual Classifier" architecture, which combines a "Minimum Euclidean Distance Classifier" and a "Hyper-sphere Classifier" under a decision-tree rule, to enhance the performance of the system, particularly the recognition rate.
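A minimal sketch of the dual-classifier idea: one hyper-sphere (class centroid plus a radius covering its training samples) per class, with a fall-back to the nearest centroid when the sphere test is ambiguous. The class names, the radius definition, and this particular tie-breaking rule are illustrative assumptions, not details taken from the thesis.

```python
import numpy as np

class DualClassifier:
    """Hyper-sphere test first; minimum-Euclidean-distance fall-back."""

    def fit(self, features, labels):
        feats = np.asarray(features, dtype=float)
        labels = np.asarray(labels)
        self.classes = np.unique(labels)
        # One centroid per class, plus a radius that covers all of that
        # class's training samples.
        self.centroids = np.stack(
            [feats[labels == c].mean(axis=0) for c in self.classes])
        self.radii = np.array([
            np.linalg.norm(feats[labels == c] - m, axis=1).max()
            for c, m in zip(self.classes, self.centroids)])
        return self

    def predict(self, x):
        d = np.linalg.norm(self.centroids - np.asarray(x, dtype=float), axis=1)
        inside = d <= self.radii          # hyper-sphere membership test
        if inside.sum() == 1:             # exactly one sphere claims the sample
            return self.classes[np.argmax(inside)]
        # Decision rule: zero or multiple hits -> nearest centroid wins.
        return self.classes[np.argmin(d)]
```

The sphere test gives a confident answer when exactly one class region contains the feature vector; the distance classifier resolves the remaining cases.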
Preliminary results show that our system achieves a training recognition rate of up to 96.6% and a testing recognition rate of up to 95.0% on 1600 samples. These results also indicate the feasibility of the system.
author2 | Wang, Chau-Chang
author_facet | Wang, Chau-Chang Wang, Tian-Peir 王天培
author | Wang, Tian-Peir 王天培
spellingShingle | Wang, Tian-Peir 王天培 Study of Applying Eigenhands and Neural Network Classifiers to Gesture Recognition
author_sort | Wang, Tian-Peir
title | Study of Applying Eigenhands and Neural Network Classifiers to Gesture Recognition
title_short | Study of Applying Eigenhands and Neural Network Classifiers to Gesture Recognition
title_full | Study of Applying Eigenhands and Neural Network Classifiers to Gesture Recognition
title_fullStr | Study of Applying Eigenhands and Neural Network Classifiers to Gesture Recognition
title_full_unstemmed | Study of Applying Eigenhands and Neural Network Classifiers to Gesture Recognition
title_sort | study of applying eigenhands and neural network classifiers to gesture recognition
publishDate | 1999
url | http://ndltd.ncl.edu.tw/handle/93992799144842651630
work_keys_str_mv | AT wangtianpeir studyofapplyingeigenhandsandneuralnetworkclassifierstogesturerecognition AT wángtiānpéi studyofapplyingeigenhandsandneuralnetworkclassifierstogesturerecognition AT wangtianpeir tèzhēngshǒuyǔlèishénjīngwǎnglùfēnlèiqìyúshǒushìbiànshízhīyánjiū AT wángtiānpéi tèzhēngshǒuyǔlèishénjīngwǎnglùfēnlèiqìyúshǒushìbiànshízhīyánjiū
_version_ | 1718342564279484416