Fusing Detected Humans in Multiple Perception Sensors Network
Main Authors: ,
Format: Article
Language: English
Published: Ton Duc Thang University, 2017-11-01
Series: Journal of Advanced Engineering and Computation
Online Access: http://jaec.vn/index.php/JAEC/article/view/61
Summary: A fusion method is proposed to retain the correct number of humans from all humans detected by a Robot Operating System (ROS) based perception sensor network (PSN) composed of multiple Kinects with partially overlapping fields of view (FOV). To this end, the fusion rules are based on the parallel and orthogonal configurations of the Kinects in the PSN system. For the parallel configuration, the system decides whether a detected human lies in the FOV of a single Kinect or in the overlapping FOV of multiple Kinects by evaluating the angles formed between the human's location and each Kinect's origin on the top view (x, z plane) of the 3D coordinate system. Based on these angles, the PSN system either keeps the person detected in a single FOV or, if the person lies in the overlapping FOV of several Kinects, keeps the detection with the largest region of interest (ROI). For the orthogonal configuration, 3D Euclidean distances between detected humans are used to group detections that are assumed to correspond to the same human observed by different Kinects; the system then keeps the detection with the largest ROI in each group. Experimental results demonstrate the effectiveness of the proposed method in various scenarios.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
ISSN: 1859-2244, 2588-123X
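
The summary above describes two fusion rules: an angle-based rule for Kinects in a parallel configuration and a distance-based rule for an orthogonal configuration, with the largest-ROI detection kept whenever duplicates are found. Below is a minimal Python sketch of those rules, not the authors' implementation; the Detection structure, the FOV half-angle, the duplicate-grouping key, and the same-person distance threshold are illustrative assumptions.

```python
# Illustrative sketch of the fusion rules described in the summary;
# all parameters and data structures below are assumptions, not the paper's code.
import math
from dataclasses import dataclass

@dataclass
class Detection:
    kinect_id: int
    position: tuple   # (x, y, z) in a common world frame, metres (assumed)
    roi_area: float   # area of the detection's region of interest, pixels (assumed)

def in_overlap(det, kinect_origins, fov_half_angle_deg=28.5):
    """Count how many Kinects' horizontal FOVs contain the detection, using the
    angle between the detection and each Kinect origin on the top view (x, z plane).
    The 28.5 degree half-angle corresponds to the Kinect's ~57 degree horizontal FOV."""
    count = 0
    for ox, oz in kinect_origins:
        angle = math.degrees(math.atan2(det.position[0] - ox, det.position[2] - oz))
        if abs(angle) <= fov_half_angle_deg:
            count += 1
    return count

def fuse_parallel(detections, kinect_origins):
    """Parallel configuration: keep a detection as-is if it lies in a single FOV;
    if it lies in an overlapping FOV, keep only the largest-ROI detection among
    duplicates (duplicates grouped here by a coarse rounded-position key)."""
    kept, best_in_cell = [], {}
    for det in detections:
        if in_overlap(det, kinect_origins) <= 1:
            kept.append(det)
        else:
            key = (round(det.position[0], 1), round(det.position[2], 1))
            if key not in best_in_cell or det.roi_area > best_in_cell[key].roi_area:
                best_in_cell[key] = det
    return kept + list(best_in_cell.values())

def fuse_orthogonal(detections, same_person_dist=0.5):
    """Orthogonal configuration: group detections whose 3D Euclidean distance is
    below a threshold (assumed to be the same person seen by different Kinects)
    and keep the largest-ROI detection of each group."""
    groups = []
    for det in detections:
        for group in groups:
            if math.dist(det.position, group[0].position) < same_person_dist:
                group.append(det)
                break
        else:
            groups.append([det])
    return [max(group, key=lambda d: d.roi_area) for group in groups]

if __name__ == "__main__":
    # Hypothetical example: two parallel Kinects see the same person in their overlap.
    kinect_origins = [(0.0, 0.0), (1.5, 0.0)]
    dets = [Detection(0, (0.75, 0.0, 2.0), 5200.0),
            Detection(1, (0.78, 0.0, 2.0), 6100.0)]
    print(fuse_parallel(dets, kinect_origins))  # keeps only the larger-ROI detection
```

The per-Kinect angle test and the largest-ROI tie-break follow the rules stated in the summary; the rounded-position grouping and the 0.5 m threshold are stand-ins for whatever association step and threshold the paper actually uses.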