Dual-Mode Pupil and Iris Gaze Tracking Systems for Real-Time Wearable Eye Trackers

Master's === National Chung Hsing University === Department of Electrical Engineering === 104 === Recently, wearable devices have made human-computer interaction easy and convenient in everyday life. In this thesis, a low-cost wearable eye tracker system is implemented. After the user completes the calibration process, the proposed system allows free head...

Full description

Bibliographic Details
Main Authors: Jia-Hao Wu, 吳家豪
Other Authors: Chih-Peng Fan
Format: Others
Language: zh-TW
Published: 2016
Online Access: http://ndltd.ncl.edu.tw/handle/64851290203988761120
id ndltd-TW-104NCHU5441063
record_format oai_dc
spelling ndltd-TW-104NCHU5441063 2017-01-08T04:17:52Z http://ndltd.ncl.edu.tw/handle/64851290203988761120 Dual-Mode Pupil and Iris Gaze Tracking Systems for Real-Time Wearable Eye Trackers 應用於穿戴式即時眼動儀的雙模式瞳孔與虹膜視線追蹤系統 Jia-Hao Wu 吳家豪 Master's, National Chung Hsing University, Department of Electrical Engineering, 104. Recently, wearable devices have made human-computer interaction easy and convenient in everyday life. In this thesis, a low-cost wearable eye tracker system is implemented. After the user completes the calibration process, the proposed system allows free head movement while estimating the gaze points accurately in real time. We propose dual-mode pupil and iris gaze tracking systems that use one camera to capture the eye image and a second camera to capture the scene image. In infrared mode, the eye camera segments the pupil and sclera information; alternatively, the user can turn off the infrared illumination and manually switch to visible-light mode to capture the iris. The system then performs ellipse fitting and estimates the pupil or iris center. In addition, the scene camera captures the image from the user's point of view. The proposed algorithm consists of the following steps: capturing eye images, searching the region of interest (ROI) of the pupil (or iris), segmenting the pupil (or iris), fitting an ellipse with a RANSAC-based process, and collecting calibration points to compute the gaze position. In our experiments, we use 9 calibration points, which improves accuracy compared with 4 calibration points. The estimated gaze shows some distortion when the user looks at the corner points of the square calibration boundary: because the human eyeball is approximately spherical, the gaze points cannot form a square when they are projected from 3-dimensional space onto a 2-dimensional image. Implemented on a PC platform with an Intel i7-3770 CPU running at 3.4 GHz, our experimental results show that the pupil center offset is within 2 pixels in both the horizontal and vertical coordinates. When the iris is tracked under visible light, the horizontal center offset is 7 pixels and the vertical offset is 5 pixels. Through the perspective mapping, the gaze placement error in pupil-tracking mode is between 0.1° and 1.46° horizontally and between 1.09° and 2.68° vertically; in iris-tracking mode, it is between 0.85° and 1.51° horizontally and between 1.53° and 3.56° vertically. Chih-Peng Fan 范志鵬 2016 學位論文 (thesis) ; 77 pages ; zh-TW
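To make the RANSAC-based ellipse fitting step concrete, the following is a minimal sketch of how a pupil or iris center could be estimated from candidate boundary points with OpenCV. It is illustrative only, not the thesis implementation: the function name ransac_fit_ellipse, the iteration count, the 5-degree polygon step, and the inlier tolerance are all assumptions.

import cv2
import numpy as np

def ransac_fit_ellipse(points, iterations=100, inlier_tol=2.0):
    # points: (N, 2) array of candidate pupil/iris boundary pixels (x, y).
    points = np.asarray(points, dtype=np.float32)
    if len(points) < 5:                       # cv2.fitEllipse needs >= 5 points
        return None
    rng = np.random.default_rng(0)
    best_ellipse, best_inliers = None, -1
    for _ in range(iterations):
        # Fit an ellipse to a random minimal sample of 5 boundary points.
        sample = points[rng.choice(len(points), 5, replace=False)]
        (cx, cy), (w, h), angle = cv2.fitEllipse(sample)
        if not np.all(np.isfinite([cx, cy, w, h])) or w <= 0 or h <= 0 or max(w, h) > 10000:
            continue                          # degenerate sample, try another
        # Approximate the candidate ellipse by a polygon and count boundary
        # points lying within the inlier tolerance of that polygon.
        poly = cv2.ellipse2Poly((int(cx), int(cy)), (int(w / 2), int(h / 2)),
                                int(angle), 0, 360, 5)
        inliers = sum(
            abs(cv2.pointPolygonTest(poly, (float(x), float(y)), True)) < inlier_tol
            for x, y in points
        )
        if inliers > best_inliers:
            best_inliers = inliers
            best_ellipse = ((cx, cy), (w, h), angle)
    return best_ellipse                       # best_ellipse[0] is the estimated center

A refit over all inliers of the best candidate would normally follow; it is omitted here for brevity.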
collection NDLTD
language zh-TW
format Others
sources NDLTD
description Master's === National Chung Hsing University === Department of Electrical Engineering === 104 === Recently, wearable devices have made human-computer interaction easy and convenient in everyday life. In this thesis, a low-cost wearable eye tracker system is implemented. After the user completes the calibration process, the proposed system allows free head movement while estimating the gaze points accurately in real time. We propose dual-mode pupil and iris gaze tracking systems that use one camera to capture the eye image and a second camera to capture the scene image. In infrared mode, the eye camera segments the pupil and sclera information; alternatively, the user can turn off the infrared illumination and manually switch to visible-light mode to capture the iris. The system then performs ellipse fitting and estimates the pupil or iris center. In addition, the scene camera captures the image from the user's point of view. The proposed algorithm consists of the following steps: capturing eye images, searching the region of interest (ROI) of the pupil (or iris), segmenting the pupil (or iris), fitting an ellipse with a RANSAC-based process, and collecting calibration points to compute the gaze position. In our experiments, we use 9 calibration points, which improves accuracy compared with 4 calibration points. The estimated gaze shows some distortion when the user looks at the corner points of the square calibration boundary: because the human eyeball is approximately spherical, the gaze points cannot form a square when they are projected from 3-dimensional space onto a 2-dimensional image. Implemented on a PC platform with an Intel i7-3770 CPU running at 3.4 GHz, our experimental results show that the pupil center offset is within 2 pixels in both the horizontal and vertical coordinates. When the iris is tracked under visible light, the horizontal center offset is 7 pixels and the vertical offset is 5 pixels. Through the perspective mapping, the gaze placement error in pupil-tracking mode is between 0.1° and 1.46° horizontally and between 1.09° and 2.68° vertically; in iris-tracking mode, it is between 0.85° and 1.51° horizontally and between 1.53° and 3.56° vertically.
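The perspective function mentioned above maps pupil (or iris) centers measured in the eye camera to gaze points in the scene camera, using the calibration correspondences. Below is a minimal sketch of one way such a mapping could be computed as a homography with OpenCV; the nine numeric correspondences, the map_gaze helper, and the use of cv2.findHomography are illustrative assumptions rather than the thesis code.

import cv2
import numpy as np

# Hypothetical pupil centers measured while the user fixates 9 calibration
# targets (eye-camera pixel coordinates).
pupil_centers = np.array([[210, 150], [320, 148], [430, 152],
                          [212, 240], [322, 242], [432, 244],
                          [214, 330], [324, 332], [434, 336]], dtype=np.float32)

# Corresponding calibration targets in the scene camera (pixel coordinates).
scene_targets = np.array([[100, 100], [640, 100], [1180, 100],
                          [100, 360], [640, 360], [1180, 360],
                          [100, 620], [640, 620], [1180, 620]], dtype=np.float32)

# Nine correspondences over-determine the 8-parameter perspective transform,
# so findHomography solves it in a least-squares sense (method=0).
H, _ = cv2.findHomography(pupil_centers, scene_targets, method=0)

def map_gaze(pupil_xy, H):
    # Project a single pupil center into scene-image coordinates.
    p = np.array([[pupil_xy]], dtype=np.float32)   # shape (1, 1, 2)
    return cv2.perspectiveTransform(p, H)[0, 0]    # gaze point (x, y)

print(map_gaze((322, 242), H))   # should land near the central target (640, 360)

Using 9 points instead of the minimal 4 over-determines the transform, which is consistent with the abstract's observation that 9 calibration points improve accuracy over 4.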
author2 Chih-Peng Fan
author Jia-Hao Wu
吳家豪
title Dual-Mode Pupil and Iris Gaze Tracking Systems for Real-Time Wearable Eye Trackers
publishDate 2016
url http://ndltd.ncl.edu.tw/handle/64851290203988761120