Spatiotemporal Multisensor Calibration via Gaussian Processes Moving Target Tracking

Published: 01 Jan 2021, Last Modified: 13 Nov 2024, IEEE Trans. Robotics 2021, CC BY-SA 4.0
Abstract: Robust and reliable perception of autonomous systems often relies on the fusion of heterogeneous sensors, which poses great challenges for multisensor calibration. In this article, we propose a method for multisensor calibration based on Gaussian process (GP) estimated moving target trajectories, resulting in spatiotemporal calibration. Unlike competing approaches, the proposed method is characterized by (i) a joint multisensor on-manifold spatiotemporal optimization framework, (ii) batch state estimation and interpolation using GPs, and (iii) computational efficiency with O(n) complexity. It only requires that all sensors can track the same target. The method is validated in simulation and in real-world experiments on five different multisensor setups: (i) a hardware-triggered stereo camera, (ii) a camera and a motion capture system, (iii) a camera and an automotive radar, (iv) a camera and a rotating 3-D lidar, and (v) a camera, a 3-D lidar, and a motion capture system. The method estimates time delays with an accuracy of up to a fraction of the fastest sensor's sampling time, outperforming a state-of-the-art ego-motion method. Furthermore, this article is complemented by an open-source toolbox implementing the calibration method, available at bitbucket.org/unizg-fer-lamor/calirad.
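For intuition only, below is a minimal Python sketch of the core idea described in the abstract: interpolate the moving target's trajectory seen by one sensor with a Gaussian process, then find the time delay and rigid transform that best align the second sensor's observations of the same target. Everything in the sketch (the dense RBF-kernel GP, the grid search over candidate delays, the closed-form 2-D alignment, and the synthetic data) is an illustrative assumption, not the paper's on-manifold, O(n) formulation or the calirad toolbox implementation.

```python
# Simplified illustration of GP-based spatiotemporal calibration:
# a dense GP interpolates sensor A's target track, and a grid search
# over the time delay (with closed-form extrinsics per candidate)
# aligns it with sensor B's observations. Synthetic 2-D data only.
import numpy as np

def gp_interpolate(t_train, y_train, t_query, length=0.5, sigma_n=0.02):
    """Dense GP regression with an RBF kernel, one output dimension."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)
    K = k(t_train, t_train) + sigma_n ** 2 * np.eye(len(t_train))
    return k(t_query, t_train) @ np.linalg.solve(K, y_train)

def align_2d(P, Q):
    """Best-fit rotation R and translation p with R @ P_i + p ~= Q_i."""
    muP, muQ = P.mean(0), Q.mean(0)
    U, _, Vt = np.linalg.svd((Q - muQ).T @ (P - muP))
    S = np.diag([1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ S @ Vt
    return R, muQ - R @ muP

# Synthetic ground truth: target trajectory, extrinsics, and time delay.
rng = np.random.default_rng(0)
th, true_delay, true_p = 0.1, 0.13, np.array([1.0, -0.5])
true_R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
traj = lambda t: np.c_[np.cos(0.6 * t), np.sin(0.9 * t)]

t_a = np.sort(rng.uniform(0.0, 10.0, 200))            # sensor A timestamps
z_a = traj(t_a) + 0.01 * rng.standard_normal((200, 2))
t_b = np.sort(rng.uniform(0.5, 9.5, 80))              # sensor B timestamps
z_b = (traj(t_b - true_delay) @ true_R.T + true_p
       + 0.01 * rng.standard_normal((80, 2)))

# Grid search over the delay; extrinsics solved in closed form per step.
best = (np.inf, None, None, None)
for d in np.linspace(0.0, 0.3, 61):
    pred = np.column_stack([gp_interpolate(t_a, z_a[:, i], t_b - d)
                            for i in range(2)])
    R, p = align_2d(pred, z_b)
    cost = np.sum((pred @ R.T + p - z_b) ** 2)
    if cost < best[0]:
        best = (cost, d, R, p)

print("estimated delay:", best[1], "(true 0.13)")
print("estimated translation:", best[3], "(true [ 1.0 -0.5])")
```

The paper's actual method differs in important ways: it optimizes all sensors jointly on the manifold of rigid-body transforms, and it uses batch GP state estimation with O(n) complexity rather than the dense kernel solve and one-dimensional delay sweep shown here.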
