Abstract: Infrared and visible image fusion is a highly effective solution for multimodal information integration. However, most fusion methods overlook the impact of low-light conditions on visible images and the imbalanced thermal radiation distribution in infrared images. An imbalanced thermal radiation distribution can alter the statistical characteristics of fused images, degrading both their visual quality and the performance of downstream vision tasks. To address these issues, we propose ITDFuse, a low-light infrared and visible image fusion method that accounts for imbalanced thermal radiation distribution. Specifically, we first design an illumination reconstruction network (ILReNet) to restore the illumination of the low-light source images, together with a DenoiseNet module that enhances the visual quality of the reconstructed images while preserving texture details from the source images. We then design a thermal radiation distribution balancing network (TRBaNet) to correct the imbalanced thermal radiation distribution of infrared images. Finally, we design a detection network (DetectNet) to jointly optimize the detection and fusion tasks, yielding a model that directly generates fused images well suited to detection. In this way, the proposed method produces high-quality fused images with bright scenes and evenly distributed thermal radiation that also facilitate detection tasks. Extensive experiments demonstrate that the fused images of the proposed ITDFuse achieve excellent performance in both quantitative metrics and visual comparisons.