Real-Time Instant Event Detection in Egocentric Videos by Leveraging Sensor-Based Motion Context
Master's Thesis === National Taiwan University (國立臺灣大學) === Graduate Institute of Computer Science and Information Engineering (資訊工程學研究所) === Academic year 103 === With the rapid growth of egocentric video captured by wearable devices, the need for instant video event detection is emerging. Unlike conventional video event detection, it must detect events in real time and trigger video recording immediately, under the limited computational budget of wearable devices (e.g., Google Glass). Conventional approaches analyze video content offline, and the visual analysis is time-consuming. Observing that wearable devices are usually equipped with sensors, we propose a novel approach to instant event detection in egocentric videos that leverages sensor-based motion context. We compute statistics of the sensor data as features, predict the user's current motion context with a hierarchical model, and then apply the corresponding ranking model to rate the importance score of the current timestamp. With importance scores available in real time, the camera on the wearable device can dynamically record micro-videos without wasting power or storage. In addition, we collected a challenging daily-life dataset, EDS (Egocentric Daily-life Videos with Sensor Data), which contains both egocentric videos and sensor data recorded with Google Glass by different subjects. Evaluation on the EDS dataset shows that our method outperforms the baselines.
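The pipeline the abstract describes (windowed sensor statistics as features, a hierarchical motion-context prediction, then a context-specific ranking model that scores importance) can be sketched as below. Everything concrete here is an illustrative assumption, not the thesis's actual models: the feature set, the rule-based two-level classifier, the linear rankers, and all thresholds are hypothetical stand-ins.

```python
import numpy as np

def window_features(accel, win=50):
    """Statistical features per non-overlapping window of 3-axis
    accelerometer samples (shape: n x 3): per-axis mean, per-axis std,
    and signal magnitude area (7 dims total). Assumed feature set."""
    feats = []
    for start in range(0, len(accel) - win + 1, win):
        w = accel[start:start + win]
        sma = np.mean(np.sum(np.abs(w), axis=1))  # signal magnitude area
        feats.append(np.concatenate([w.mean(axis=0), w.std(axis=0), [sma]]))
    return np.array(feats)

def predict_context(feat, sma_thresh=1.5, std_thresh=0.5):
    """Hierarchical prediction as a two-level rule: a coarse
    static/active split first, then a finer label inside each branch.
    Thresholds are made up for illustration."""
    sma, std_mag = feat[6], np.linalg.norm(feat[3:6])
    if sma < sma_thresh:                                   # coarse: static
        return "sitting" if std_mag < std_thresh else "standing"
    return "walking" if std_mag < 2.0 else "running"       # coarse: active

# One (hypothetical) linear ranking model per motion context:
# importance = w . feature_vector, with a different w for each context.
RANKERS = {
    "sitting": np.full(7, 0.05), "standing": np.full(7, 0.1),
    "walking": np.full(7, 0.2),  "running":  np.full(7, 0.4),
}

def importance_scores(accel, win=50):
    """Score each window with the ranker of its predicted context."""
    out = []
    for f in window_features(accel, win):
        ctx = predict_context(f)
        out.append((ctx, float(RANKERS[ctx] @ f)))
    return out
```

In use, the device would compare each window's score against a recording threshold and start capturing a micro-video when it is exceeded, which is how the real-time score can gate the camera without continuous recording.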
Main Author: | Pei-Yun Hsu 許培芸 |
Other Author: | 徐宏民 |
Format: | Others |
Language: | en_US |
Published: | 2015 |
Online Access: | http://ndltd.ncl.edu.tw/handle/55445343928103467019 |
id |
ndltd-TW-103NTU05392121 |
record_format |
oai_dc |
spelling |
ndltd-TW-103NTU053921212016-11-19T04:09:57Z http://ndltd.ncl.edu.tw/handle/55445343928103467019 Real-Time Instant Event Detection in Egocentric Videos by Leveraging Sensor-Based Motion Context 利用感測器資料分析運動狀態進行第一人稱影片即時事件偵測 Pei-Yun Hsu 許培芸 碩士 國立臺灣大學 資訊工程學研究所 103 徐宏民 2015 學位論文 ; thesis 14 en_US |
collection |
NDLTD |
language |
en_US |
format |
Others
|
sources |
NDLTD |
author2 |
徐宏民 |
author_facet |
徐宏民 Pei-Yun Hsu 許培芸 |
author |
Pei-Yun Hsu 許培芸 |
spellingShingle |
Pei-Yun Hsu 許培芸 Real-Time Instant Event Detection in Egocentric Videos by Leveraging Sensor-Based Motion Context |
author_sort |
Pei-Yun Hsu |
title |
Real-Time Instant Event Detection in Egocentric Videos by Leveraging Sensor-Based Motion Context |
title_short |
Real-Time Instant Event Detection in Egocentric Videos by Leveraging Sensor-Based Motion Context |
title_full |
Real-Time Instant Event Detection in Egocentric Videos by Leveraging Sensor-Based Motion Context |
title_fullStr |
Real-Time Instant Event Detection in Egocentric Videos by Leveraging Sensor-Based Motion Context |
title_full_unstemmed |
Real-Time Instant Event Detection in Egocentric Videos by Leveraging Sensor-Based Motion Context |
title_sort |
real-time instant event detection in egocentric videos by leveraging sensor-based motion context |
publishDate |
2015 |
url |
http://ndltd.ncl.edu.tw/handle/55445343928103467019 |
work_keys_str_mv |
AT peiyunhsu realtimeinstanteventdetectioninegocentricvideosbyleveragingsensorbasedmotioncontext AT xǔpéiyún realtimeinstanteventdetectioninegocentricvideosbyleveragingsensorbasedmotioncontext AT peiyunhsu lìyònggǎncèqìzīliàofēnxīyùndòngzhuàngtàijìnxíngdìyīrénchēngyǐngpiànjíshíshìjiànzhēncè AT xǔpéiyún lìyònggǎncèqìzīliàofēnxīyùndòngzhuàngtàijìnxíngdìyīrénchēngyǐngpiànjíshíshìjiànzhēncè |
_version_ |
1718395012840947712 |