Recognizing Indoor Human Activity in Canonical Space

Bibliographic Details
Main Author: Sheng-Tien Cho (卓聖田)
Other Authors: Jyh-Yeong Chang
Format: Others
Language: en_US
Published: 2005
Online Access: http://ndltd.ncl.edu.tw/handle/73d8b7
Description
Summary: Master's thesis === National Chiao Tung University === Department of Electrical and Control Engineering === 93 === Human activity recognition from video streams has a wide range of applications, such as human-machine interfaces, security surveillance, and home care systems. In video processing, an image sequence is usually so large that the human activity is difficult to recognize directly, so a data transformation such as principal component analysis or a wavelet transform is usually applied first. The objective of this thesis is to provide a human-like system for automatic surveillance that tracks people and identifies their activities. We present a system for video-based human activity recognition that transforms the images into a canonical space. In our system, the foreground subject is first extracted as a binary image by a statistical background model based on frame ratios, which is robust to illumination change; the binary image is then projected by eigenspace and canonical space transformations, and recognition is performed in the canonical space. By representing each activity with several essential templates, the proposed system can recognize the subject's activity from a down-sampled image sequence instead of all consecutive frames, which reduces recognition complexity, decreases the computational load, and improves recognition performance. Without referring to any spatial information such as the subject's location, path, or velocity, the proposed system recognizes the activity from the binary images alone and works very well.
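The abstract does not reproduce the exact background model, but the frame-ratio idea it names can be sketched in a few lines. In the sketch below, the statistical background is approximated by a per-pixel median over empty-scene frames, and the `tol` threshold is a hypothetical stand-in for the thesis's statistical test; dividing the ratio image by its median is one way to realize the claimed robustness, since a global illumination change multiplies every pixel by roughly the same factor.

```python
import numpy as np

def background_model(empty_frames):
    # Simple statistical background: per-pixel median over a clip of the
    # empty scene (a stand-in for the thesis's actual background model).
    return np.median(np.stack(empty_frames).astype(np.float64), axis=0)

def frame_ratio_foreground(frame, background, tol=0.25):
    # Ratio of the current frame to the background. A global illumination
    # change scales every pixel by about the same factor, so dividing the
    # ratio image by its median cancels that factor; foreground pixels are
    # those whose normalized ratio still deviates from 1 by more than tol.
    ratio = frame.astype(np.float64) / np.maximum(background, 1.0)
    ratio /= np.median(ratio)
    return np.abs(ratio - 1.0) > tol  # binary silhouette image
```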
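The eigenspace and canonical space transformations described above amount to PCA followed by Fisher discriminant analysis on the vectorized binary silhouettes, with recognition by nearest activity templates in the canonical space. The sketch below assumes that reading; the names and parameters (`n_eigen`, `step`, the `templates` dictionary) are illustrative rather than the thesis's own.

```python
import numpy as np

def fit_projection(X, labels, n_eigen=50):
    # X: (n_samples, n_pixels) matrix, one flattened binary silhouette per
    # row; labels: integer activity class per row. Returns (mean, W) such
    # that a new silhouette x maps to canonical space as (x - mean) @ W.
    mean = X.mean(axis=0)
    Xc = X - mean
    # Eigenspace transformation: top n_eigen principal components via SVD.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_eigen].T                      # (n_pixels, n_eigen)
    Z = Xc @ P                              # samples in eigenspace
    # Canonical space transformation: directions maximizing between-class
    # scatter relative to within-class scatter (classical Fisher LDA).
    classes = np.unique(labels)
    Sw = np.zeros((n_eigen, n_eigen))
    Sb = np.zeros((n_eigen, n_eigen))
    gmean = Z.mean(axis=0)
    for c in classes:
        Zc = Z[labels == c]
        mc = Zc.mean(axis=0)
        Sw += (Zc - mc).T @ (Zc - mc)
        diff = (mc - gmean)[:, None]
        Sb += len(Zc) * diff @ diff.T
    evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    top = np.argsort(evals.real)[::-1][: len(classes) - 1]
    return mean, P @ evecs[:, top].real     # combined projection matrix

def recognize(frames, mean, W, templates, step=5):
    # Classify a clip by down-sampling (every step-th frame rather than all
    # consecutive frames) and voting for the nearest template in canonical
    # space, mirroring the complexity reduction described in the abstract.
    votes = {}
    for f in frames[::step]:
        y = (f.ravel().astype(np.float64) - mean) @ W
        label = min(templates, key=lambda k: min(
            np.linalg.norm(y - t) for t in templates[k]))
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)
```

Here `templates` would map each activity label to a small list of canonical-space vectors, obtained by projecting that activity's essential template silhouettes through the same `mean` and `W`.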