Keywords: IMU/RGB-D camera extrinsic calibration, Sensor fusion, Ground plane segmentation, Pose estimation, RANSAC, Target-less calibration, Uncrewed aerial manipulator
Abstract: Accurate extrinsic calibration between inertial measurement units (IMUs) and RGB-D cameras is essential for trajectory estimation in multi-sensor robotic manipulators. Target-based calibration methods, such as Kalibr, are not suitable for online calibration, while state-of-the-art target-less techniques often fail to achieve high accuracy under poor illumination. This study proposes a ground plane-based extrinsic calibration method for uncrewed aerial vehicles (UAVs) and, more generally, other multi-sensor robotic systems. The proposed method employs a deep neural network to extract the floor segment from RGB images. The corresponding depth pixels are then back-projected to fit a plane. The normal of this plane, together with the gravity vector reliably detected from IMU measurements, is used to estimate the extrinsic calibration parameters. The trained deep neural network achieves precision and recall of up to 0.96. Furthermore, comparative experiments demonstrate that the proposed method outperforms MATLAB's target-based calibration toolbox. These results highlight the effectiveness of floor-segmentation-based calibration for UAV applications.
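The pipeline described in the abstract (floor segmentation, depth back-projection, RANSAC plane fitting, normal-gravity alignment) can be illustrated with a minimal sketch. This is not the authors' implementation; all function names and parameters (back_project, ransac_plane, fx, fy, cx, cy, tol) are illustrative assumptions, and the floor mask is taken as given from the segmentation network.

```python
# Minimal sketch (assumed, not the paper's code): back-project floor-mask
# depth pixels, fit a plane with RANSAC, and align the plane normal with the
# IMU gravity direction to recover the extrinsic rotation.
import numpy as np

def back_project(depth, mask, fx, fy, cx, cy):
    """Lift masked depth pixels (meters) to 3-D points in the camera frame."""
    v, u = np.nonzero(mask)
    z = depth[v, u]
    valid = z > 0
    u, v, z = u[valid], v[valid], z[valid]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=1)

def ransac_plane(points, iters=500, tol=0.01, rng=np.random.default_rng(0)):
    """Fit a plane n.p + d = 0 with a simple RANSAC loop; return its unit normal."""
    best_inliers = None
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:                       # degenerate (collinear) sample
            continue
        n = n / norm
        d = -n @ sample[0]
        inliers = np.abs(points @ n + d) < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refine on all inliers: the smallest singular vector of the centered
    # inlier cloud is the least-squares plane normal.
    p = points[best_inliers]
    _, _, vt = np.linalg.svd(p - p.mean(axis=0))
    n = vt[-1]
    # Resolve the sign ambiguity: orient the normal toward the camera origin.
    return n if n @ p.mean(axis=0) < 0 else -n

def rotation_between(a, b):
    """Rotation matrix taking unit vector a onto unit vector b (Rodrigues)."""
    v = np.cross(a, b)
    c = float(a @ b)
    if np.isclose(c, -1.0):                   # antiparallel: 180-degree turn
        axis = np.linalg.svd(a[None])[2][-1]  # any axis orthogonal to a
        return 2.0 * np.outer(axis, axis) - np.eye(3)
    k = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + k + k @ k / (1.0 + c)

# Usage sketch: n_cam from the fitted floor plane, g_imu from averaged static
# accelerometer readings; the resulting rotation is the rotational part of the
# camera-IMU extrinsics.
# n_cam = ransac_plane(back_project(depth, floor_mask, fx, fy, cx, cy))
# g_imu = accel_mean / np.linalg.norm(accel_mean)
# R_cam_imu = rotation_between(n_cam, -g_imu)  # floor normal up, gravity down
```

Note that a single plane-normal/gravity pair constrains only two of the three rotational degrees of freedom (rotation about the gravity axis is unobservable from one view), so in practice multiple poses or additional constraints are needed for a full extrinsic estimate.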
Submission Number: 28