IAO-SLAM: Real-time Illumination-Aware Object SLAM for Robust Perception in Low-Light Environments
Keywords: SLAM, localization, degraded illumination, object mapping
TL;DR: A real-time object-assisted framework that integrates low-light enhancement, object detection, and robust multi-modal association to enable reliable perception and mapping under degraded illumination.
Abstract: Robust perception and navigation remain challenging for visual Simultaneous Localization and Mapping (SLAM) systems in low-light environments, where degraded illumination reduces feature quality and hinders object detection. While geometry-based SLAM under low-light conditions has been explored, most existing solutions either sacrifice real-time performance or neglect the effect of poor illumination on object-level perception and association. This paper proposes Illumination-Aware Object SLAM (IAO-SLAM), an object-assisted framework explicitly designed to handle low-light scenarios. The system integrates the Zero-DCE++ and YOLOv12 networks to provide real-time low-light image enhancement and object detection. To achieve robust object association, an Adaptive Multi-modal Similarity Fusion (AMSF) strategy is introduced that combines Wasserstein distance, Intersection over Union (IoU) overlap, and point-feature assistance, ensuring statistical, geometric, and motion-level consistency. Built upon ORB-SLAM3, our method jointly leverages point features and object landmarks to construct consistent maps under degraded illumination. Comprehensive experiments on a low-light processed TUM RGB-D dataset demonstrate that IAO-SLAM significantly improves localization accuracy and object reconstruction in low-light environments while maintaining real-time performance.
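To illustrate the kind of multi-modal similarity fusion the abstract describes, the sketch below shows one plausible way to combine a Wasserstein-distance cue, an IoU cue, and a point-feature cue into a single association score. It is a minimal illustration rather than the authors' AMSF implementation: the box-to-Gaussian modeling, the exp(-W2/tau) normalization, the shared map-point heuristic, and the fusion weights are all assumptions made for this example.

```python
# Hypothetical sketch of fusing Wasserstein distance, IoU overlap, and
# point-feature assistance into one object-association score.
# Not the paper's AMSF strategy; parameters and modeling choices are assumed.
import numpy as np

def box_to_gaussian(box):
    """Model an axis-aligned box (x1, y1, x2, y2) as a 2D Gaussian:
    mean = box center, diagonal covariance from the half-extents."""
    x1, y1, x2, y2 = box
    mu = np.array([(x1 + x2) / 2.0, (y1 + y2) / 2.0])
    var = np.array([((x2 - x1) / 2.0) ** 2, ((y2 - y1) / 2.0) ** 2])
    return mu, var

def wasserstein_similarity(box_a, box_b, tau=50.0):
    """Closed-form 2-Wasserstein distance between the two diagonal Gaussians,
    mapped to (0, 1] with exp(-W2 / tau). tau is an assumed scale."""
    mu_a, var_a = box_to_gaussian(box_a)
    mu_b, var_b = box_to_gaussian(box_b)
    w2_sq = np.sum((mu_a - mu_b) ** 2) + np.sum((np.sqrt(var_a) - np.sqrt(var_b)) ** 2)
    return float(np.exp(-np.sqrt(w2_sq) / tau))

def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def point_feature_assist(points_a, points_b):
    """Fraction of map-point IDs shared between a detection and a candidate
    object landmark; a simple stand-in for point-feature assistance."""
    if not points_a or not points_b:
        return 0.0
    shared = len(set(points_a) & set(points_b))
    return shared / min(len(points_a), len(points_b))

def fused_similarity(box_det, box_obj, pts_det, pts_obj,
                     weights=(0.4, 0.4, 0.2)):
    """Weighted fusion of the three cues (weights are illustrative); a
    detection would be associated with the landmark scoring highest."""
    w_wd, w_iou, w_pt = weights
    return (w_wd * wasserstein_similarity(box_det, box_obj)
            + w_iou * iou(box_det, box_obj)
            + w_pt * point_feature_assist(pts_det, pts_obj))

if __name__ == "__main__":
    det_box, obj_box = (100, 120, 200, 260), (110, 118, 215, 250)
    det_pts, obj_pts = [3, 7, 11, 19], [7, 11, 23, 42]
    print(f"fused similarity: {fused_similarity(det_box, obj_box, det_pts, obj_pts):.3f}")
```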
Email Sharing: We authorize the sharing of all author emails with Program Chairs.
Data Release: We authorize the release of our submission and author names to the public in the event of acceptance.
Submission Number: 6