Abstract: Robot perception in dynamic, confined, unstructured environments is a challenging task due to unanticipated changes in the surroundings. Although 3D perception sensors capture terrain topology with high precision, interim variations between collected sensor frames, caused by the motion of entities relative to the robot, lead to noisy maps of the environment. This article presents a real-time 3D perception filter that detects and eliminates moving point clusters from input pointcloud data collected in an indoor environment. Using LiDAR and IMU sensors, the proposed mechanism supports precise 3D pointcloud map generation in dynamic, unstructured, GPS-denied environments. A novel approach is proposed based on the concepts of data clustering, relative motion, pointcloud change detection, and confidence tracking. The novelty of this approach lies in its ability to detect within-cluster movements and in a generic tracking method that handles the inconsistent motion of objects typically found in indoor environments. To detect moving objects, the proposed mechanism requires no prior knowledge about the target entity. For pointcloud preprocessing, a ground plane removal approach is proposed based on voxel grid covariance along the axis normal to the ground. The approach was evaluated on a humanoid robot in indoor office environments using a Velodyne VLP-16 LiDAR and an Intel T265 IMU. The results show that the proposed approach is efficient at detecting indoor moving objects in real time.
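The ground plane removal step mentioned above can be illustrated with a minimal sketch: voxelize the cloud over the ground plane, then discard voxels whose points have low variance along the ground-normal axis (a flat patch) and a low mean height. The function name, the voxel size, and both thresholds below are illustrative assumptions, not values from the article; z is assumed to be the ground-normal axis.

```python
import numpy as np

def remove_ground(points, voxel_size=0.2, z_var_thresh=1e-3, z_mean_thresh=0.15):
    """Drop points in near-flat, low-lying voxels (likely ground).

    points: (N, 3) array of x, y, z, with z normal to the ground.
    A voxel is labelled ground when the variance of its points' z values
    is small (flat patch) and its mean z is close to ground level.
    Thresholds here are illustrative, not taken from the article.
    """
    # Map each point to a 2D voxel index over the ground plane.
    idx = np.floor(points[:, :2] / voxel_size).astype(np.int64)
    # Group points sharing a voxel index.
    keys, inverse = np.unique(idx, axis=0, return_inverse=True)
    keep = np.ones(len(points), dtype=bool)
    for v in range(len(keys)):
        mask = inverse == v
        z = points[mask, 2]
        # Low z-variance and low mean height -> ground voxel: drop its points.
        if z.var() < z_var_thresh and z.mean() < z_mean_thresh:
            keep[mask] = False
    return points[keep]
```

Voxels containing an upright object keep a high z-variance even if they also touch the floor, so tall structures survive the filter while flat floor patches are removed.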