Label Dependency Aware Loss for Reliable Multi-Label Medical Image Classification

Published at ICASSP 2025 · Last modified: 28 Feb 2026 · License: CC BY-SA 4.0
Abstract: A key challenge in multi-label classification is to model the dependencies between labels while ensuring proper calibration, as the assumption of label independence often results in inferior classification performance and poor calibration. However, most earlier works that modeled label dependencies neglected the problem of ensuring calibrated results, which is crucial in safety-critical applications such as medical image analysis. In this paper, we propose a novel training loss function, Label Dependency Aware Cross Entropy (LDACE), specifically designed to capture pairwise label dependencies during the learning process. Additionally, we introduce an auxiliary loss, Canonical Calibration Loss (CCL), which, when combined with LDACE, yields better calibration. We evaluate the effectiveness of the proposed loss function through experiments on three publicly available datasets – ChestMNIST, PTB-XL and RFMiD – by comparing it against traditional multi-label losses using various deep learning models. The classification and calibration results demonstrate the superiority of the proposed loss function over traditional loss functions in terms of Hamming loss, Area Under the Receiver Operating Characteristic Curve (AUC), Average Calibration Error (ACE) and Maximum Calibration Error (MCE). Furthermore, the experiments also show that the auxiliary loss not only improves calibration performance but also retains classification performance. The code is available at https://github.com/theimageprocessingguy/LDACE-CCL.
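The exact LDACE and CCL formulations are given in the paper and the linked repository. Purely as an illustration of the general idea of a dependency-aware multi-label loss, the sketch below combines standard binary cross-entropy with a penalty on the gap between the predicted and empirical pairwise label co-occurrence matrices. The function names, the co-occurrence penalty, and the weighting term `lam` are assumptions for illustration only, not the paper's method.

```python
import numpy as np

def bce(p, y, eps=1e-7):
    """Mean binary cross-entropy over all (sample, label) pairs.

    p, y: arrays of shape (N, L) with predicted probabilities and
    binary ground-truth label indicators.
    """
    p = np.clip(p, eps, 1 - eps)
    return float(-(y * np.log(p) + (1 - y) * np.log(1 - p)).mean())

def pairwise_dependency_penalty(p, y):
    """Illustrative pairwise-dependency term (NOT the paper's LDACE):
    squared gap between the batch's empirical label co-occurrence
    matrix and the one implied by the predicted probabilities.
    """
    n = len(y)
    co_true = y.T @ y / n   # (L, L) empirical co-occurrence
    co_pred = p.T @ p / n   # (L, L) predicted co-occurrence
    return float(np.mean((co_true - co_pred) ** 2))

def dependency_aware_loss(p, y, lam=1.0):
    """BCE plus a weighted pairwise co-occurrence penalty."""
    return bce(p, y) + lam * pairwise_dependency_penalty(p, y)
```

In a real training setup this would be written with a differentiable framework so the penalty backpropagates through the model; the NumPy form above only shows the structure of the objective.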