PointNorm: Dual Normalization is All You Need for Point Cloud Analysis
Document Type
Conference Proceeding
Publication Date
1-1-2023
Abstract
Point cloud analysis is challenging due to the irregularity of the point cloud data structure. Existing works typically employ the ad-hoc sampling-grouping operation of PointNet++, followed by sophisticated local and/or global feature extractors for leveraging the 3D geometry of the point cloud. Unfortunately, the sampling-grouping operations do not address the point cloud's irregularity, whereas the intricate local and/or global feature extractors lead to poor computational efficiency. In this paper, we introduce a novel DualNorm module after the sampling-grouping operation to effectively and efficiently address the irregularity issue. The DualNorm module consists of Point Normalization, which normalizes the grouped points to the sampled points, and Reverse Point Normalization, which normalizes the sampled points to the grouped points. The proposed framework, PointNorm, utilizes local mean and global standard deviation to benefit from both local and global features while maintaining a fast inference speed. Experiments show that we achieved excellent accuracy and efficiency on ModelNet40 classification, ScanObjectNN classification, ShapeNetPart part segmentation, and S3DIS semantic segmentation. Code is available at https://github.com/ShenZheng2000/PointNorm-for-Point-Cloud-Analysis.
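A minimal NumPy sketch of the dual normalization idea, based only on the abstract's description (local mean, global standard deviation); the function name, tensor shapes, and `eps` parameter are illustrative assumptions, not the authors' implementation — see the linked repository for the actual code.

```python
import numpy as np

def dual_norm(sampled, grouped, eps=1e-5):
    """Sketch of DualNorm as described in the abstract (assumed shapes).

    sampled: (n, 3) centroids from the sampling step.
    grouped: (n, k, 3) the k neighbors gathered around each centroid.
    """
    # Point Normalization: center each neighborhood on its sampled point
    # (the local mean) and scale by one standard deviation computed over
    # all groups (the global standard deviation).
    centered = grouped - sampled[:, None, :]
    grouped_norm = centered / (centered.std() + eps)

    # Reverse Point Normalization: symmetrically, normalize each sampled
    # point toward the mean of its own group of neighbors.
    rev_centered = sampled - grouped.mean(axis=1)
    sampled_norm = rev_centered / (rev_centered.std() + eps)

    return sampled_norm, grouped_norm
```

Under this reading, both branches share the same recipe (subtract a local mean, divide by a global scale), which is what lets the module stay cheap compared with learned local/global feature extractors.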
Publication Title
Proceedings of the International Joint Conference on Neural Networks
DOI
10.1109/IJCNN54540.2023.10191312
Recommended Citation
Zheng, Shen; Pan, Jinqian; Lu, Changjie; and Gupta, Gaurav, "PointNorm: Dual Normalization is All You Need for Point Cloud Analysis" (2023). Kean Publications. 301.
https://digitalcommons.kean.edu/keanpublications/301