Scene-Aware Online Calibration of LiDAR and Cameras for Driving Systems

Published: 01 Jan 2024 (Last Modified: 06 Mar 2025) · IEEE Trans. Instrum. Meas. 2024 · CC BY-SA 4.0
Abstract: This article introduces a robust method for accurately detecting calibration failures and calibrating multiline light detection and ranging (LiDAR) sensors and cameras online in natural environments. Traditional target-free calibration methods rely on matching the spatial structure of 3-D point clouds with image features; however, obtaining dense point cloud data for matching and optimization within a short time is challenging in online applications. To address this, our method uses single-frame sparse LiDAR point clouds for robust feature extraction and matching, with further optimization through contextual observation. Moreover, our approach can perceive extrinsic errors in natural scenes online and recalibrate accordingly, enhancing the calibration's robustness. We demonstrate the robustness and generalizability of our method on our own dataset, LIVOX-Road, with evaluation results indicating subpixel accuracy. The code is released at: https://github.com/JMU-Robot/LiDAR-Camera-Online-Calibration.
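The abstract does not give implementation details, but the core target-free idea it rests on, projecting LiDAR points through the current extrinsics and scoring their agreement with image features, can be sketched briefly. The snippet below is a minimal illustration under assumed conventions, not the authors' pipeline: the names `project_points` and `alignment_score`, the pinhole intrinsic matrix `K`, and the edge-map scoring objective are all assumptions introduced here for exposition.

```python
import numpy as np

def project_points(points_lidar, T_cam_lidar, K):
    """Project 3-D LiDAR points into the image plane.

    points_lidar: (N, 3) points in the LiDAR frame.
    T_cam_lidar:  (4, 4) extrinsic transform (LiDAR -> camera).
    K:            (3, 3) pinhole camera intrinsic matrix.
    Returns (M, 2) pixel coordinates of points in front of the camera.
    """
    # Homogeneous transform of all points into the camera frame.
    pts_h = np.hstack([points_lidar, np.ones((points_lidar.shape[0], 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    # Keep only points with positive depth (in front of the camera).
    pts_cam = pts_cam[pts_cam[:, 2] > 0]
    # Perspective projection, then dehomogenize.
    uv = (K @ pts_cam.T).T
    return uv[:, :2] / uv[:, 2:3]

def alignment_score(uv, edge_map):
    """Mean image-edge strength sampled at the projected pixel locations."""
    h, w = edge_map.shape
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    if not inside.any():
        return 0.0
    return float(edge_map[v[inside], u[inside]].mean())

if __name__ == "__main__":
    # Synthetic smoke test: identity extrinsics, toy intrinsics,
    # and a random stand-in for a real image edge map.
    rng = np.random.default_rng(0)
    points = rng.uniform([-5, -2, 2], [5, 2, 30], size=(500, 3))
    K = np.array([[700.0,   0.0, 320.0],
                  [  0.0, 700.0, 240.0],
                  [  0.0,   0.0,   1.0]])
    T = np.eye(4)
    uv = project_points(points, T, K)
    edges = rng.random((480, 640))
    print(f"alignment score: {alignment_score(uv, edges):.3f}")
```

In an online setting, a sustained drop in such an alignment score across consecutive frames could flag an extrinsic error and trigger recalibration, which mirrors the failure-perception idea the abstract describes; the paper's actual feature extraction, matching, and contextual optimization are more involved than this sketch.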