Point cloud classification network based on self-attention mechanism

Published: 01 Jan 2022, Last Modified: 13 Nov 2024, Comput. Electr. Eng. 2022, CC BY-SA 4.0
Abstract: PointNet makes it possible to process point cloud data directly. However, PointNet extracts only global features and cannot capture fine local features, so the main goal of this research is to build a refined local feature extractor. Recently, the Transformer has been applied to point cloud processing tasks with better performance than other methods. Drawing on the Transformer, we use the self-attention mechanism to design a refined feature extractor that captures richer feature information. In addition, we obtain local geometric information at different scales with a local feature extraction module and use an affine transformation to normalize the local features. We report results on the ModelNet40 dataset; the new feature extraction network greatly improves classification performance.
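The abstract describes a self-attention-based feature extractor for per-point features. A minimal sketch of point-wise self-attention is shown below; the function name, projection matrices, and dimensions are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

def self_attention(points, w_q, w_k, w_v):
    """Point-wise self-attention over a point cloud feature matrix.

    points: (N, C) per-point features; w_q/w_k/w_v: (C, C) learned projections.
    Returns refined (N, C) features where each point attends to all others.
    (Hypothetical sketch of the general mechanism, not the paper's exact design.)
    """
    q = points @ w_q                              # queries  (N, C)
    k = points @ w_k                              # keys     (N, C)
    v = points @ w_v                              # values   (N, C)
    scores = q @ k.T / np.sqrt(points.shape[1])   # (N, N) scaled similarities
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)       # softmax over all points
    return attn @ v                               # attention-weighted sum

rng = np.random.default_rng(0)
n, c = 128, 16                                    # 128 points, 16-dim features
pts = rng.normal(size=(n, c))
wq, wk, wv = (rng.normal(size=(c, c)) * 0.1 for _ in range(3))
out = self_attention(pts, wq, wk, wv)
print(out.shape)                                  # -> (128, 16)
```

Because every point attends to every other point, the output features mix global context into each point's representation, which is what lets an attention-based extractor capture relationships that PointNet's per-point MLP plus max pooling cannot.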