Density-Aware Early Fusion for Vehicle Collaborative Perception

Published: 01 Jan 2025, Last Modified: 19 May 2025 · IEEE Intell. Transp. Syst. Mag. 2025 · CC BY-SA 4.0
Abstract: Vehicle collaborative perception enhances perception performance in autonomous driving by exchanging and fusing information among vehicles. Early fusion, which exchanges and fuses raw sensor data, preserves rich scene details for high-performance environmental perception. However, transmitting raw data requires substantial bandwidth, challenging real-time perception; optimizing perception accuracy while minimizing data transmission is therefore critical in early fusion. This article proposes a density-aware early fusion approach to enhance the adaptability of vehicle collaborative perception. First, we investigate the influence of point cloud density on perception accuracy and present an area density-aware module: applying a density threshold filters out redundant data, enabling efficient transmission and fusion. Furthermore, we propose a mixed self-attention module that improves perceptual accuracy by effectively extracting local object features, focusing attention on object regions. Experimental results validate the efficacy of our method, which achieves state-of-the-art 3D object detection performance on the V2XSet, OPV2V, and S2S-sim datasets. Compared to conventional early fusion, data transmission is reduced by 24%.
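The core idea of the density-aware module, filtering redundant points in high-density regions before transmission, can be illustrated with a minimal sketch. This is not the authors' implementation: the voxel-count density measure, the `voxel_size` and `max_per_voxel` parameters, and the random subsampling policy are all assumptions made for illustration.

```python
import numpy as np

def density_filter(points, voxel_size=0.5, max_per_voxel=32, rng=None):
    """Hypothetical density-aware downsampling of an (N, 3) point cloud.

    Voxels whose point count exceeds `max_per_voxel` are randomly
    subsampled, so dense (redundant) regions contribute fewer points to
    the transmitted payload while sparse regions are kept intact.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    # Quantize coordinates to integer voxel keys.
    keys = np.floor(points[:, :3] / voxel_size).astype(np.int64)
    # Map each point to its voxel index.
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    keep = []
    for v in np.unique(inverse):
        idx = np.flatnonzero(inverse == v)
        if len(idx) > max_per_voxel:
            # Dense voxel: transmit only a random subset of its points.
            idx = rng.choice(idx, size=max_per_voxel, replace=False)
        keep.append(idx)
    return points[np.sort(np.concatenate(keep))]

# A synthetic cloud clustered near the origin is reduced substantially,
# while an already-sparse cloud would pass through nearly unchanged.
pts = np.random.default_rng(1).normal(size=(10000, 3)).astype(np.float32)
filtered = density_filter(pts)
```

In a collaborative setting, only `filtered` would be broadcast to nearby vehicles, which matches the abstract's goal of trading a small amount of redundancy for a large cut in bandwidth.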