CalibRBEV: Multi-Camera Calibration via Reversed Bird's-eye-view Representations for Autonomous Driving
Abstract: Camera calibration is crucial in computer vision tasks and applications, e.g., autonomous driving (AD). However, prevailing camera calibration models rely on a time-consuming and labor-intensive off-board process in mass-production settings, while also leaving real-world AD scenarios largely unexplored. To this end, inspired by recent advancements in bird's-eye-view (BEV) perception models, this paper proposes a novel multi-camera Calibration method via Reversed BEV representations for AD, termed CalibRBEV. The proposed CalibRBEV model comprises two stages. First, we reverse the BEV perception pipeline, reconstructing bounding boxes through an attention auto-encoder module to fully extract the latent reversed BEV representations. Second, the representations obtained from the encoder interact with the surrounding multi-view image features for further refinement and prediction of the calibration parameters. Extensive experimental results on the nuScenes and Waymo datasets validate the effectiveness of our proposed model.
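To make the two-stage pipeline concrete, below is a minimal PyTorch sketch of the structure the abstract describes: an attention auto-encoder that reconstructs BEV bounding boxes to produce latent reversed-BEV representations, followed by cross-attention with multi-view image features to regress calibration parameters. All module names, tensor shapes, and hyperparameters here are illustrative assumptions, not the paper's actual architecture.

```python
# Hedged sketch of the CalibRBEV two-stage idea; dimensions and heads are assumed.
import torch
import torch.nn as nn


class ReversedBEVAutoEncoder(nn.Module):
    """Stage 1 (assumed form): attention auto-encoder that encodes BEV bounding
    boxes into latent reversed-BEV representations and reconstructs the boxes."""

    def __init__(self, box_dim=7, d_model=256, n_heads=8, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(box_dim, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, n_layers)
        dec_layer = nn.TransformerDecoderLayer(d_model, n_heads, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, n_layers)
        self.recon_head = nn.Linear(d_model, box_dim)

    def forward(self, boxes):                      # boxes: (B, N, box_dim)
        tokens = self.embed(boxes)
        latent = self.encoder(tokens)              # latent reversed-BEV representations
        recon = self.recon_head(self.decoder(tokens, latent))
        return latent, recon                       # recon supervises the auto-encoder


class CalibrationHead(nn.Module):
    """Stage 2 (assumed form): cross-attention between the latent representations
    and per-camera image features, then regression of calibration parameters
    (e.g., a 6-DoF extrinsic per camera)."""

    def __init__(self, d_model=256, n_heads=8, param_dim=6):
        super().__init__()
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Sequential(nn.Linear(d_model, d_model), nn.ReLU(),
                                  nn.Linear(d_model, param_dim))

    def forward(self, latent, img_feats):          # img_feats: (B, V, L, d_model)
        B, V, L, D = img_feats.shape
        params = []
        for v in range(V):                         # refine each camera view separately
            refined, _ = self.cross_attn(img_feats[:, v], latent, latent)
            params.append(self.head(refined.mean(dim=1)))
        return torch.stack(params, dim=1)          # (B, V, param_dim)


if __name__ == "__main__":
    boxes = torch.randn(2, 20, 7)                  # dummy BEV boxes (x, y, z, w, l, h, yaw)
    img_feats = torch.randn(2, 6, 100, 256)        # dummy features from 6 surround cameras
    latent, recon = ReversedBEVAutoEncoder()(boxes)
    calib = CalibrationHead()(latent, img_feats)
    print(recon.shape, calib.shape)                # (2, 20, 7) and (2, 6, 6)
```

The per-view loop simply illustrates how each camera's features could query the shared reversed-BEV latent; how the paper actually fuses views and parameterizes the calibration output is specified in the full text.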