Spatial-Related Correlation Network for 3D Point Clouds
Due to the irregularity and inconsistency of 3D point clouds, it is difficult to extract features directly from them. Existing methods usually extract point features independently and then use the max-pooling operation to aggregate local features, which limits the feature representation capability of their models. In this work, we design a novel spatial-related correlation path, which considers both spatial information and point correlations, to preserve high-dimensional features, thereby capturing fine-detail information and the long-distance context of the point cloud. We further propose a new network that aggregates the spatial-aware correlations with point-wise features and global features in a learnable way. The experimental results show that our method achieves better performance than state-of-the-art approaches on challenging datasets: 0.934 accuracy on the ModelNet40 dataset and 0.875 mean IoU (Intersection over Union) on the ShapeNet dataset, with only about 2.42 million parameters.
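The record contains no implementation details beyond this abstract. As a rough illustration of the kind of architecture the abstract describes, the sketch below combines per-point features, a simple distance-based correlation term, and a max-pooled global feature in a learnable fusion layer. It is a minimal PyTorch sketch under assumed shapes and names (the class SpatialCorrelationSketch, the softmax-over-distances correlation, and the fusion MLP are all illustrative stand-ins), not the authors' network.

```python
# Illustrative sketch only (assumption, not the paper's code): per-point features,
# a distance-based correlation term, and a global feature are fused by a learnable layer
# instead of relying on max-pooling alone.
import torch
import torch.nn as nn


class SpatialCorrelationSketch(nn.Module):
    def __init__(self, num_classes: int = 40, feat_dim: int = 128):
        super().__init__()
        # Shared per-point MLP implemented as 1x1 convolutions over the point dimension.
        self.point_mlp = nn.Sequential(
            nn.Conv1d(3, 64, 1), nn.BatchNorm1d(64), nn.ReLU(),
            nn.Conv1d(64, feat_dim, 1), nn.BatchNorm1d(feat_dim), nn.ReLU(),
        )
        # Learnable fusion of point-wise, correlation-aggregated, and global features.
        self.fuse = nn.Sequential(
            nn.Conv1d(3 * feat_dim, feat_dim, 1), nn.BatchNorm1d(feat_dim), nn.ReLU(),
        )
        self.head = nn.Linear(feat_dim, num_classes)

    def forward(self, xyz: torch.Tensor) -> torch.Tensor:
        # xyz: (B, N, 3) raw point coordinates.
        feats = self.point_mlp(xyz.transpose(1, 2))               # (B, C, N) point-wise features
        # Simple spatial correlation: softmax over negative pairwise distances mixes
        # features of nearby points (a stand-in for the paper's correlation path).
        weights = torch.softmax(-torch.cdist(xyz, xyz), dim=-1)   # (B, N, N)
        corr = torch.bmm(feats, weights.transpose(1, 2))          # (B, C, N)
        # Global feature via max-pooling, broadcast back to every point.
        glob = feats.max(dim=2, keepdim=True).values.expand_as(feats)
        fused = self.fuse(torch.cat([feats, corr, glob], dim=1))  # learnable aggregation
        return self.head(fused.max(dim=2).values)                 # (B, num_classes)


if __name__ == "__main__":
    logits = SpatialCorrelationSketch()(torch.randn(2, 1024, 3))
    print(logits.shape)  # torch.Size([2, 40])
```

The design point being illustrated is the abstract's claim: instead of summarizing the shape with max-pooling alone, spatial correlations and global context are concatenated with the point-wise features and passed through a trainable fusion layer.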
Main Authors: Dan Wang, Guoqing Hu, Chengzhi Lyu
Format: Article
Language: English
Published: IEEE, 2020-01-01
Series: IEEE Access
Subjects: 3D point clouds; feature extraction; point correlations; neural network
Online Access: https://ieeexplore.ieee.org/document/9123391/
id |
doaj-639185760bf2423bb1895aecb04565fb |
record_format |
Article |
spelling
doaj-639185760bf2423bb1895aecb04565fb
Record updated: 2021-03-30T02:27:46Z
Language: eng
Publisher: IEEE
Journal: IEEE Access, ISSN 2169-3536
Published: 2020-01-01, vol. 8, pp. 116004-116012
DOI: 10.1109/ACCESS.2020.3004472 (IEEE document 9123391)
Title: Spatial-Related Correlation Network for 3D Point Clouds
Authors: Dan Wang (ORCID 0000-0002-3272-9045), Guoqing Hu (ORCID 0000-0001-6526-2560), Chengzhi Lyu (ORCID 0000-0001-9160-0324), all with the School of Mechanical and Automotive Engineering, South China University of Technology, Guangzhou, China
Online Access: https://ieeexplore.ieee.org/document/9123391/
Keywords: 3D point clouds; feature extraction; point correlations; neural network
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Dan Wang; Guoqing Hu; Chengzhi Lyu
title |
Spatial-Related Correlation Network for 3D Point Clouds |
publisher |
IEEE |
series |
IEEE Access |
issn |
2169-3536 |
publishDate |
2020-01-01 |
description |
Due to the irregularity and inconsistency of 3D point clouds, it is difficult to extract features directly from them. Existing methods usually extract point features independently and then use the max-pooling operation to aggregate local features, which limits the feature representation capability of their models. In this work, we design a novel spatial-related correlation path, which considers both spatial information and point correlations, to preserve high dimensional features, thereby capturing fine-detail information and long-distance context of the point cloud. We further propose a new network to aggregate the spatial aware correlations with point-wise features and global features in a learnable way. The experimental results show that our method can achieve better performance than the state-of-the-art approaches on challenging datasets. We can achieve 0.934 accuracy on ModelNet40 dataset and 0.875 mean IoU (Intersection over Union) on ShapeNet dataset with only about 2.42 million parameters. |
topic |
3D point clouds; feature extraction; point correlations; neural network
url |
https://ieeexplore.ieee.org/document/9123391/ |