Abstract: Deep learning has achieved significant success in multi-modality medical image fusion (MMIF). Nevertheless, the distribution of spatial information varies across regions within a medical image. Current methods treat the medical image as a whole, leading to uneven fusion and susceptibility to artifacts in edge regions. To address this problem, we delve into regional information fusion and introduce an entropy-aware dynamic path selection network (EDPSN). Specifically, we introduce a novel edge enhancement module (EEM) to mitigate artifacts in edge regions through a central concentration gradient (CCG). Additionally, an entropy-aware division (ED) module is designed to delineate the spatial information levels of distinct regions in the image through entropy convolution. Finally, a dynamic path selection (DPS) module is introduced to enable adaptive fusion of regions with diverse spatial information. Experimental comparisons with several state-of-the-art image fusion methods demonstrate the outstanding performance of EDPSN on three datasets encompassing MRI-CT, MRI-PET, and MRI-SPECT. Moreover, the robustness of the proposed method is validated on the CHAOS dataset, and its clinical value is validated by sixteen doctors and medical students.
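The entropy-aware division described above rests on the classical notion of local image entropy: flat regions carry little information, while textured or edge regions carry more. The sketch below is purely illustrative and not the paper's ED module (which uses a learned entropy convolution); it computes a per-pixel Shannon-entropy map with a sliding window, with the function name and window size chosen here as assumptions.

```python
import numpy as np

def local_entropy(img, win=7):
    """Per-pixel Shannon-entropy map over a sliding window.

    Illustrative sketch only: the paper's ED module uses a learned
    'entropy convolution'; this shows the classical local-entropy idea
    it builds on. `img` is a 2-D uint8 array, `win` an odd window size.
    """
    pad = win // 2
    padded = np.pad(img, pad, mode="reflect")
    h, w = img.shape
    ent = np.zeros((h, w), dtype=np.float64)
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + win, j:j + win]
            # Histogram of intensities in the window, then Shannon entropy.
            counts = np.bincount(patch.ravel(), minlength=256)
            p = counts[counts > 0] / patch.size
            ent[i, j] = -np.sum(p * np.log2(p))
    return ent

# A flat region has zero entropy; a textured region has higher entropy,
# so thresholding such a map can delineate regions by information level.
flat = np.full((16, 16), 128, dtype=np.uint8)
textured = (np.arange(256).reshape(16, 16) % 251).astype(np.uint8)
print(local_entropy(flat).max(), local_entropy(textured).mean())
```

Thresholding such an entropy map into bands is one simple way to route low- and high-information regions down different fusion paths.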