DuPMAM: An Efficient Dual Perception Framework Equipped With a Sharp Testing Strategy for Point Cloud Analysis
Abstract: The challenges in point cloud analysis stem primarily from the irregular and unordered nature of the data. Many existing approaches, inspired by the Transformer, introduce attention mechanisms to extract 3D geometric features. However, these intricate geometric extractors incur high computational overhead and unfavorable inference latency. To address this, we propose a lightweight and faster attention-based network, named Dual Perception MAM (DuPMAM), for point cloud analysis. Specifically, we present a simple, novel Point Multiplicative Attention Mechanism (PMAM), implemented solely through single feed-forward fully connected layers, which leads to lower model complexity and faster inference. On top of that, we devise a dual perception strategy, constructing both a local attention block and a global attention block to learn fine-grained geometric features and overall representational features, respectively. As a result, compared to existing approaches, our method perceives both the local details and the global contours of point cloud objects well. In addition, we design a Graph-Multiscale Perceptual Field (GMPF) testing strategy that further enhances model performance. It offers a significant advantage over the traditional voting strategy and applies broadly to point cloud tasks, including classification, part segmentation, and indoor scene segmentation. Empowered by the GMPF testing strategy, DuPMAM sets a new state of the art on the real-world dataset ScanObjectNN, the synthetic dataset ModelNet40, and the part segmentation dataset ShapeNet; compared to the recent GB-Net, DuPMAM trains 6 times faster and tests 2 times faster.
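The abstract does not specify the exact form of PMAM, but the idea of attention computed solely through single fully connected layers can be illustrated with a minimal sketch. The following NumPy code is a hypothetical interpretation, not the authors' implementation: one FC layer (`W_score`) produces a scalar score per point, the scores are softmax-normalized over the point set, and the resulting weights multiplicatively pool FC-projected point features into a global descriptor. All names and shapes here are assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fc_multiplicative_attention(feats, W_score, W_value):
    """Hypothetical single-FC-layer multiplicative attention over a point set.

    feats:   (N, C) per-point features
    W_score: (C, 1) weights of a score-producing fully connected layer
    W_value: (C, C) weights of a value-projecting fully connected layer
    Returns per-point features gated by an attention-pooled global context.
    """
    scores = feats @ W_score            # (N, 1): one scalar score per point
    weights = softmax(scores, axis=0)   # normalize across the N points
    values = feats @ W_value            # (N, C): projected point features
    context = (weights * values).sum(axis=0)  # (C,): attention-weighted global feature
    return feats * context              # (N, C): multiplicative gating by the context

rng = np.random.default_rng(0)
N, C = 128, 16
feats = rng.normal(size=(N, C))
out = fc_multiplicative_attention(feats, rng.normal(size=(C, 1)), rng.normal(size=(C, C)))
print(out.shape)
```

Because the only learned operations are plain matrix multiplications by FC weights (no pairwise query-key score matrix), the cost is linear in the number of points, which is consistent with the lower complexity and faster inference the abstract claims.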