IAFFNet: Illumination-Aware Feature Fusion Network for All-Day RGB-Thermal Semantic Segmentation of Road Scenes

Published: 01 Jan 2022, Last Modified: 13 Nov 2024, IEEE Access 2022, CC BY-SA 4.0
Abstract: Semantic segmentation based on RGB and thermal images is an effective way to achieve all-day understanding of road scenes. However, how to fuse RGB and thermal information effectively remains an open problem. By studying fusion strategies at different stages, we develop an illumination-aware feature fusion network, called IAFFNet, for all-day semantic segmentation of urban road scenes. At the encoding stage, we introduce a bi-directional guided feature fusion module to effectively recalibrate and unify the RGB and thermal information. At the decoding stage, we design adaptive fusion modules to fuse low-level details with high-level semantic information. Finally, we develop a decision-level illumination-aware strategy to achieve robust all-day segmentation. To the best of our knowledge, we are the first to explicitly incorporate illumination cues into RGB-T semantic segmentation. Extensive experimental evaluations demonstrate that the proposed method achieves remarkable performance on public datasets compared with state-of-the-art methods, reaching an all-day mIoU of 56.6% on a public RGB-thermal urban scene dataset.
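To make the decision-level illumination-aware strategy concrete, the sketch below shows one plausible way such a fusion could work: an illumination score estimated from the RGB image weights the RGB-branch and thermal-branch segmentation logits. This is a minimal illustration under our own assumptions, not the authors' implementation; the class name IlluminationAwareFusion, the tiny illum_head estimator, and all tensor shapes are hypothetical.

# Minimal sketch (assumed, not the paper's code): decision-level illumination-aware
# fusion of per-branch segmentation logits, weighted by an illumination score
# predicted from the RGB input.
import torch
import torch.nn as nn


class IlluminationAwareFusion(nn.Module):
    """Hypothetical decision-level fusion: blend RGB-branch and thermal-branch
    logits using an illumination weight estimated from the RGB image."""

    def __init__(self):
        super().__init__()
        # Toy illumination estimator: global average pooling over the RGB image
        # followed by a linear layer and sigmoid, giving a scalar in [0, 1].
        # The real network's estimator is presumably more elaborate.
        self.illum_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(3, 1),
            nn.Sigmoid(),
        )

    def forward(self, rgb_image, rgb_logits, thermal_logits):
        # w close to 1: well-lit scene, trust the RGB branch more;
        # w close to 0: dark scene, trust the thermal branch more.
        w = self.illum_head(rgb_image).view(-1, 1, 1, 1)
        fused_logits = w * rgb_logits + (1.0 - w) * thermal_logits
        return fused_logits, w


if __name__ == "__main__":
    B, C, H, W = 2, 9, 120, 160                 # batch, classes, logit resolution (illustrative)
    fusion = IlluminationAwareFusion()
    rgb = torch.rand(B, 3, 480, 640)            # RGB input image
    rgb_logits = torch.randn(B, C, H, W)        # logits from the RGB-dominant branch
    thermal_logits = torch.randn(B, C, H, W)    # logits from the thermal-dominant branch
    out, w = fusion(rgb, rgb_logits, thermal_logits)
    print(out.shape, w.flatten())

The key design point this sketch illustrates is that the illumination cue acts at the decision level, after both branches have produced dense predictions, so the network can degrade gracefully between daytime and nighttime conditions.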
