SLAM in Low-Light Environments Based on Infrared-Visible Light Fusion

Published: 01 Jan 2024 · Last Modified: 13 Nov 2024 · ICCA 2024 · CC BY-SA 4.0
Abstract: Traditional visual Simultaneous Localization and Mapping (SLAM) techniques struggle to extract effective information in non-ideal environments, such as scenes with changing illumination or heavy smoke, which degrades SLAM performance. To overcome these challenges, this paper proposes a visual SLAM front-end system based on infrared-visible light fusion. The system achieves precise optimization of camera poses and map point locations in non-ideal environments by jointly optimizing the reprojection errors of visible-light image point features and infrared image edge features. In addition, this paper further improves the robustness of the algorithm in non-ideal environments through back-end optimization that tightly couples the infrared-visible measurements with an Inertial Measurement Unit (IMU).
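The joint optimization the abstract describes can be illustrated with a minimal sketch of the combined front-end cost. The abstract does not give the exact formulation, so everything below is an assumption: a pinhole projection model, a squared pixel reprojection term for visible-light point features, an edge term realized as a distance-transform lookup on the infrared image (zero on an edge), and an illustrative weight `lam` balancing the two.

```python
# Hedged sketch of a joint point/edge reprojection cost; all function names,
# the distance-transform edge term, and the weight `lam` are assumptions,
# not the paper's actual formulation.

def project(point_3d, fx, fy, cx, cy):
    """Pinhole projection of a camera-frame 3D point to pixel coordinates."""
    X, Y, Z = point_3d
    return (fx * X / Z + cx, fy * Y / Z + cy)

def point_reprojection_error(point_3d, observed_uv, intrinsics):
    """Squared pixel error between the projected and observed point feature."""
    u, v = project(point_3d, *intrinsics)
    du, dv = u - observed_uv[0], v - observed_uv[1]
    return du * du + dv * dv

def edge_residual(point_3d, dist_transform, intrinsics):
    """Squared distance-transform value at the projected pixel: an edge map
    point that lands exactly on an infrared edge contributes zero cost."""
    u, v = project(point_3d, *intrinsics)
    d = dist_transform[int(round(v))][int(round(u))]
    return d * d

def joint_cost(points, vis_obs, dist_transform, intrinsics, lam=0.5):
    """Weighted sum of visible point terms and infrared edge terms, the
    scalar a solver would minimize over poses and map points."""
    cost = sum(point_reprojection_error(p, uv, intrinsics)
               for p, uv in zip(points, vis_obs))
    cost += lam * sum(edge_residual(p, dist_transform, intrinsics)
                      for p in points)
    return cost
```

In a full system this scalar would be minimized over camera poses and map point positions by a nonlinear least-squares solver (e.g. Levenberg-Marquardt), with the IMU preintegration terms added as additional residuals in the tightly coupled back end.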