Reliability-Aware Fusion for Semantic Segmentation under Sensor Degradation and Failures

Published: 26 Apr 2026 · Last Modified: 26 Apr 2026 · RJCIA 2026 Long Paper · License: CC BY 4.0
Keywords: Semantic segmentation, Multimodal fusion, Sensor degradation, Dempster-Shafer theory, Autonomous driving.
Abstract: Semantic segmentation in real-world driving scenarios is particularly challenging due to sensor degradation, failures, and changing environmental conditions. While multimodal fusion is a common remedy, many existing approaches treat all modalities equally, ignoring how their reliability varies across semantic classes and conditions. In this paper, we present ReCoLaF (Reliability-aware Conflict-guided Late Fusion), a novel deep fusion framework for multimodal semantic segmentation under uncertainty. ReCoLaF adaptively adjusts the contribution of each sensor modality through a two-stage weighting strategy: a learned class-wise reliability module that estimates how relevant each modality is for each semantic class, and a conflict-based adjustment that measures local inconsistencies between modalities at the pixel level. The fusion is formulated within the Dempster-Shafer theory of evidence, providing a principled, mathematically grounded treatment of uncertainty for robust prediction. We evaluate ReCoLaF on the DeLiVER (synthetic) and MUSES (real-world) datasets under diverse weather conditions and degraded sensor configurations. ReCoLaF consistently achieves higher average performance under sensor failures, highlighting the benefit of jointly modeling semantic reliability and inter-modality agreement for robust fusion in complex driving scenarios.
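To make the fusion mechanism described in the abstract concrete, the following is a minimal, self-contained sketch of reliability-discounted Dempster-Shafer combination for per-pixel class masses. It is an illustration of the general evidential-fusion recipe (Shafer discounting followed by Dempster's rule, with the normalization constant exposed as a conflict measure), not the authors' ReCoLaF implementation; the function names, the two-modality setup, and the scalar reliability are all assumptions for the example.

```python
import numpy as np

def discount(mass, reliability):
    """Shafer discounting: scale singleton class masses by the source's
    reliability and move the remainder onto the full frame Theta (ignorance).
    `mass` has shape (..., K+1); entries 0..K-1 are singleton class masses,
    the last entry is the mass on Theta."""
    m = mass.copy()
    m[..., :-1] *= reliability
    m[..., -1] = 1.0 - m[..., :-1].sum(axis=-1)
    return m

def combine(m1, m2):
    """Dempster's rule for two mass functions whose focal elements are
    the K singleton classes plus the full frame Theta. Returns the
    normalized combined masses and the pixel-wise conflict mass."""
    s1, t1 = m1[..., :-1], m1[..., -1:]
    s2, t2 = m2[..., :-1], m2[..., -1:]
    joint = s1 * s2 + s1 * t2 + t1 * s2   # evidence agreeing on each class
    theta = t1 * t2                       # both sources remain ignorant
    conflict = 1.0 - joint.sum(axis=-1, keepdims=True) - theta
    fused = np.concatenate([joint, theta], axis=-1) / (1.0 - conflict)
    return fused, conflict

# Two modalities disagreeing at one pixel over K=2 classes:
m_rgb = np.array([0.8, 0.1, 0.1])     # confident in class 0
m_lidar = np.array([0.2, 0.7, 0.1])   # prefers class 1

# Down-weight the (hypothetically degraded) lidar before combining.
fused, conflict = combine(m_rgb, discount(m_lidar, reliability=0.5))
```

The exposed `conflict` mass is what a conflict-guided scheme like the one in the abstract can feed back into the per-modality weights: high local conflict signals that at least one sensor is unreliable at that pixel.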
Email Sharing: We authorize the sharing of all author emails with Program Chairs.
Data Release: We authorize the release of our submission and author names to the public in the event of acceptance.
Submission Number: 18