A Survey on Vision-based Navigation Systems Robust to Illumination Changes

Published: 01 Jan 2022 · Last Modified: 13 Nov 2024 · ICEIC 2022 · CC BY-SA 4.0
Abstract: This paper surveys visual navigation methods that are robust to illumination changes. Visual navigation, which involves estimating a robot's pose and reconstructing the surrounding environment, has been the focus of extensive research in the field of autonomous mobile vehicles. Initially, visual navigation was divided into a localization problem and a mapping problem, and independent attempts were made to solve each of them. Because the two problems are closely interdependent, they were gradually integrated into the visual simultaneous localization and mapping (vSLAM) problem. vSLAM has evolved from filter-based methods to optimization-based methods that achieve high accuracy in real time. However, such vision-based navigation systems perform data association and pose estimation under the assumption that the illumination of the environment does not change, which is not guaranteed in the real world. Therefore, research efforts are being made to make visual navigation systems robust to illumination changes so that they can be deployed in a wider range of environments. In this paper, we survey state-of-the-art research on visual navigation robust to illumination changes.
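To make the brightness-constancy assumption mentioned in the abstract concrete, the sketch below shows a minimal photometric residual of the kind minimized by direct vSLAM methods. It is an illustrative assumption on our part, not an implementation from any system covered by the survey; the function and variable names are hypothetical. When illumination changes between the reference and current frames, the same scene points no longer yield the same intensities, so a residual of this form biases the pose estimate even when the correspondences are geometrically correct.

```python
import numpy as np

def photometric_residual(ref_img, cur_img, ref_pts, cur_pts):
    """Sum of squared intensity differences between corresponding pixels.

    Direct vSLAM methods minimize a residual of this form over the camera
    pose, which implicitly assumes brightness constancy: the same scene
    point produces the same intensity in both images. A global illumination
    change violates this assumption. Names here are illustrative only.
    """
    # ref_pts / cur_pts: (N, 2) integer arrays of (x, y) pixel coordinates.
    ref_vals = ref_img[ref_pts[:, 1], ref_pts[:, 0]].astype(np.float64)
    cur_vals = cur_img[cur_pts[:, 1], cur_pts[:, 0]].astype(np.float64)
    return np.sum((ref_vals - cur_vals) ** 2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)
    # Simulate a global brightening of the scene by 40 intensity levels.
    cur = np.clip(ref.astype(np.int32) + 40, 0, 255).astype(np.uint8)
    pts = rng.integers(0, [640, 480], size=(100, 2))
    # With perfect correspondences but changed illumination, the residual
    # is large even though the geometry is unchanged.
    print(photometric_residual(ref, cur, pts, pts))
```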