Real-Time Instant Event Detection in Egocentric Videos by Leveraging Sensor-Based Motion Context

Bibliographic Details
Main Authors: Pei-Yun Hsu, 許培芸
Other Authors: 徐宏民
Format: Others
Language: en_US
Published: 2015
Online Access: http://ndltd.ncl.edu.tw/handle/55445343928103467019
Description
Summary: Master's === National Taiwan University === Graduate Institute of Computer Science and Information Engineering === 103 === With the rapid growth of egocentric videos from wearable devices, the need for instant video event detection is emerging. Unlike conventional video event detection, it must perform event detection and video recording in real time under the limited computational budget of wearable devices (e.g., Google Glass). Conventional video event detection analyzes video content offline, and its visual analysis is time-consuming. Observing that wearable devices are usually equipped with sensors, we propose a novel approach for instant event detection in egocentric videos that leverages sensor-based motion context. We compute statistics of the sensor data as features, predict the user's current motion context with a hierarchical model, and then choose the corresponding ranking model to rate the importance score of the current timestamp. With importance scores provided in real time, the camera on the wearable device can dynamically record micro-videos without wasting power and storage. In addition, we collected a challenging daily-life dataset called EDS (Egocentric Daily-life Videos with Sensor Data), which contains both egocentric videos and sensor data recorded by Google Glass from different subjects. We evaluate our system on the EDS dataset, and the results show that our method outperforms the baselines.
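
The record contains no code, so the following is only a minimal sketch of the pipeline the abstract describes: windowed sensor statistics feed a motion-context classifier, which routes each timestamp to a per-context ranking model that outputs an importance score. Every concrete choice here is an assumption for illustration, not the author's implementation: the statistics used (mean/std/min/max), the scikit-learn models, and the flattening of the thesis's hierarchical context model into a single-stage classifier.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, GradientBoostingRegressor

def window_features(window):
    """Summary statistics over one window of raw sensor samples.
    `window` has shape [n_samples, n_channels], e.g. accelerometer
    and gyroscope axes sampled over a short time window."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           window.min(axis=0), window.max(axis=0)])

class ContextAwareScorer:
    """Stage 1: classify the user's current motion context from sensor
    features. Stage 2: route to the per-context regressor that rates
    the importance of the current timestamp. (The thesis uses a
    hierarchical context model; a single classifier stands in here.)"""

    def __init__(self):
        self.context_clf = RandomForestClassifier(n_estimators=100)
        self.rankers = {}  # context label -> importance regressor

    def fit(self, windows, contexts, importances):
        X = np.stack([window_features(w) for w in windows])
        contexts = np.asarray(contexts)
        importances = np.asarray(importances)
        self.context_clf.fit(X, contexts)
        # Train one ranking model per motion context.
        for c in np.unique(contexts):
            mask = contexts == c
            self.rankers[c] = GradientBoostingRegressor().fit(X[mask], importances[mask])
        return self

    def score(self, window):
        x = window_features(window).reshape(1, -1)
        context = self.context_clf.predict(x)[0]
        return context, float(self.rankers[context].predict(x)[0])

# Example usage with synthetic data (two contexts, 3-axis sensor):
# windows = [np.random.randn(50, 3) for _ in range(40)]
# contexts = ["walking" if i % 2 else "sitting" for i in range(40)]
# importances = np.random.rand(40)
# scorer = ContextAwareScorer().fit(windows, contexts, importances)
# print(scorer.score(np.random.randn(50, 3)))
```

In a deployment like the one the abstract describes, a recording controller would presumably compare each score against a threshold and trigger micro-video capture only when it is exceeded, which is what saves power and storage relative to continuous recording.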