Aggregating Two-Stream Trajectory using Neural Network for Counting Arbitrary Human Action Repetition
Master's thesis === National Taiwan University === Graduate Institute of Computer Science and Information Engineering === Academic year 106 === Although deep neural networks have recently achieved great success in computer vision, determining the number of repetitions of arbitrary periodic human actions remains challenging. The difficulties lie in the varying frame length of repetitions, the temporal localization of the person performing the action, and the different features that correspond to different motions. Moreover, demand for counting human action repetitions is rising in applications such as medical rehabilitation and sports. To address this problem, we construct a human action dataset and propose a new framework, the Human Action Repetition Counter (HARC), which works on arbitrary human actions with a single architecture. HARC learns to count repetitions of a human action in the time-frequency domain, a choice settled after a few pilot studies. Experiments show that HARC outperforms previous counting methods on benchmarks. Additionally, we design novel learning strategies that generate effective synthetic data to pretrain our network, which further boosts performance and yields more accurate results. We also demonstrate that HARC is capable of counting periodic object motions. Our dataset, YT_Human_Segments, will be made publicly available to benefit future research.
Main Authors: | Chih-Yu Lin (林之宇) |
---|---|
Other Authors: | 徐宏民 |
Chinese Title: | 以神經網路整合雙串流軌跡資料用於計算任意人類動作次數 |
Format: | Others (thesis) |
Language: | zh-TW |
Published: | 2018 |
Online Access: | http://ndltd.ncl.edu.tw/handle/k9343t |
Record ID: | ndltd-TW-106NTU05392065 |
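The abstract notes that HARC counts repetitions in the time-frequency domain of trajectory signals. As a rough illustration of that general idea only — not the thesis's actual HARC architecture — the sketch below estimates a repetition count for a single tracked 1-D motion trajectory by locating the dominant frequency in its power spectrum. The function name, signature, and parameters are hypothetical, and the approach assumes a reasonably clean, roughly periodic signal.

```python
import numpy as np
from scipy.signal import detrend, periodogram

def estimate_repetitions(trajectory, fps):
    """Estimate the repetition count of a roughly periodic 1-D motion signal.

    trajectory: 1-D array of a tracked coordinate (e.g., wrist y-position) per frame.
    fps: frames per second of the source video.
    """
    x = detrend(np.asarray(trajectory, dtype=float))  # remove linear drift
    freqs, power = periodogram(x, fs=fps)             # power spectrum of the signal
    # Ignore the DC bin and pick the dominant frequency (in cycles per second).
    dominant = freqs[1:][np.argmax(power[1:])]
    duration = len(x) / fps                           # clip length in seconds
    return dominant * duration                        # repetitions = frequency * duration

# Example: a synthetic 10-second clip at 30 fps containing 8 noisy cycles.
t = np.linspace(0, 10, 300, endpoint=False)
signal = np.sin(2 * np.pi * 0.8 * t) + 0.1 * np.random.randn(t.size)
print(round(estimate_repetitions(signal, fps=30)))    # prints roughly 8
```

A spectral peak picker like this breaks down for the variable-period, noisy trajectories the thesis targets, which is presumably why a learned, neural aggregation of two-stream trajectory features is proposed instead.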