Facial Expression Analysis in E-learning

Master's thesis, National Dong Hwa University, Department of Computer Science and Information Engineering, academic year 102. This thesis proposes a vision-based e-learning system that analyzes students' facial expressions to infer their learning states. First, an Active Shape Model is used to align and track facial feature points. Second, the eye, brow, and mouth regions are extracted from the captured images, and the information inside these regions is quantified to form feature vectors. Third, optical flow is used to track head-shaking and nodding actions. Medium-level facial actions are then recognized from these low-level image features. Finally, four high-level learning states are estimated using regression models trained on manually marked ground truth. The proposed system offers teachers real-time information about students' learning affect, so they can arrange the curriculum and adjust their teaching strategies according to students' learning states, improving learning interest and motivation.
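The pipeline summarized in the abstract (landmark tracking, region-based feature vectors, optical-flow head-gesture cues, and regression to learning states) can be sketched roughly as below. This is a minimal illustration, not the thesis's implementation: it assumes the common iBUG 68-point landmark layout instead of the thesis's Active Shape Model points, uses OpenCV's pyramidal Lucas-Kanade optical flow for head motion, and substitutes scikit-learn Ridge regression for the unspecified regression models; all function names, landmark indices, and feature choices here are illustrative assumptions.

```python
# Hypothetical sketch of the abstract's pipeline (not the thesis code).
# Assumptions: iBUG 68-point landmark layout, OpenCV Lucas-Kanade optical
# flow, and scikit-learn Ridge regression as a stand-in regressor.
import numpy as np
import cv2
from sklearn.linear_model import Ridge

def eye_aspect_ratio(eye):
    """Eye openness from 6 eye landmarks (p1..p6 in iBUG order)."""
    v1 = np.linalg.norm(eye[1] - eye[5])
    v2 = np.linalg.norm(eye[2] - eye[4])
    h = np.linalg.norm(eye[0] - eye[3])
    return (v1 + v2) / (2.0 * h)

def low_level_features(landmarks):
    """Quantify eye, brow, and mouth regions into one feature vector.

    `landmarks` is a (68, 2) array from any face-alignment tracker
    (the thesis uses an Active Shape Model)."""
    left_eye, right_eye = landmarks[36:42], landmarks[42:48]
    brows = landmarks[17:27]
    eye_open = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2.0
    # Brow raise: vertical gap between eye centers and brows (image y grows
    # downward, so a raised brow gives a larger positive value).
    eye_center_y = np.concatenate([left_eye, right_eye]).mean(axis=0)[1]
    brow_raise = eye_center_y - brows[:, 1].mean()
    # Mouth openness and width, normalized by inter-ocular distance.
    iod = np.linalg.norm(left_eye.mean(axis=0) - right_eye.mean(axis=0))
    mouth_open = np.linalg.norm(landmarks[62] - landmarks[66]) / iod
    mouth_width = np.linalg.norm(landmarks[48] - landmarks[54]) / iod
    return np.array([eye_open, brow_raise / iod, mouth_open, mouth_width])

def head_motion(prev_gray, gray, prev_pts):
    """Track face points with pyramidal Lucas-Kanade optical flow and return
    the mean (dx, dy); dominant dy over time suggests nodding, dominant dx
    suggests head shaking. `prev_pts` is float32 with shape (N, 1, 2)."""
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, prev_pts, None)
    good = status.ravel() == 1
    if not good.any():
        return np.zeros(2), prev_pts
    flow = (next_pts[good] - prev_pts[good]).reshape(-1, 2)
    return flow.mean(axis=0), next_pts

def train_state_regressors(X, Y):
    """Fit one regressor per high-level learning state.

    X: (n_samples, n_features) mid-level action features.
    Y: (n_samples, 4) manually marked ground-truth scores for the
       four learning states."""
    return [Ridge(alpha=1.0).fit(X, Y[:, k]) for k in range(Y.shape[1])]

def estimate_states(models, x):
    """Map one feature vector to the four learning-state scores."""
    return np.array([m.predict(x.reshape(1, -1))[0] for m in models])
```

In use, a face-alignment tracker would supply `landmarks` for each frame; `low_level_features` and `head_motion` would be aggregated over a short time window into mid-level facial-action descriptors; and the four regressors, trained on manually marked ground truth as the abstract describes, would map those descriptors to learning-state scores shown to the teacher in real time.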

Bibliographic Details
Main Author: Jyun-Lin Wu (吳俊霖)
Other Authors: Mau-Tsuen Yang (楊茂村)
Original Title: 臉部表情分析應用於數位學習
Format: Others
Published: 2014
Online Access: http://ndltd.ncl.edu.tw/handle/e9drsj