A Study of Gazing Estimation Using Active Appearance Model



Bibliographic Details
Main Author: Chun-Tsai Yeh (葉俊材)
Other Authors: Yi-Leh Wu
Format: Others
Language: en_US
Published: 2011
Online Access: http://ndltd.ncl.edu.tw/handle/8k4k9j
Summary: Master's thesis, Department of Computer Science and Information Engineering, National Taiwan University of Science and Technology, academic year 99. In recent years, human-computer interaction has become an increasingly popular research area. Most methods use body movements, gestures, and eye gaze direction as the basis for interaction, and gaze estimation remains an active research domain. In this thesis, we present an efficient method for estimating the eye gaze point. We locate the eye region by modifying the characteristics of the Active Appearance Model (AAM). Then, by employing a Support Vector Machine (SVM), we estimate five gazing directions through classification. The original 68 facial feature points in AAM are modified into 36 eye feature points. Based on the two-dimensional coordinates of these feature points, we classify the different directions of eye gazing. The 36 feature points describe the eye contour, iris size, iris location, and pupil position. In addition, camera resolution does not affect the accuracy of our method in determining the direction of the line of sight. The final results show the independence of the classifications, the classification error, and accurate estimation of the gazing directions.
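The abstract does not give implementation details, but the described pipeline (36 two-dimensional eye feature points from the modified AAM, fed to an SVM that classifies five gazing directions) can be sketched as below. This is a minimal illustration only: the use of scikit-learn's SVC, the RBF kernel, the direction labels, and the placeholder data are assumptions and are not specified in the thesis abstract.

```python
# Minimal sketch of the gaze-direction classification step described above:
# 36 eye feature points (x, y) per sample -> 72-dimensional feature vector
# -> multi-class SVM over five gazing directions.
# Assumptions (not from the thesis): scikit-learn's SVC with an RBF kernel,
# synthetic placeholder data in place of real AAM fitting output.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

GAZE_DIRECTIONS = ["left", "right", "up", "down", "center"]  # hypothetical labels
N_POINTS = 36    # eye feature points retained from the 68-point AAM
N_SAMPLES = 500  # placeholder dataset size

rng = np.random.default_rng(0)

# Placeholder for AAM output: each sample is 36 (x, y) coordinates.
# In the actual system these would come from fitting the modified AAM to a face image.
points = rng.random((N_SAMPLES, N_POINTS, 2))
labels = rng.integers(0, len(GAZE_DIRECTIONS), size=N_SAMPLES)

# Flatten the 2-D coordinates into one feature vector per sample.
features = points.reshape(N_SAMPLES, N_POINTS * 2)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0
)

# Scale the coordinates and train the multi-class SVM (one-vs-one by default in SVC).
classifier = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
classifier.fit(X_train, y_train)

# Predict the gazing direction for unseen eye shapes.
predicted = classifier.predict(X_test)
print("Accuracy on placeholder data:", (predicted == y_test).mean())
```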