Generalizable Diabetic Retinopathy Grading via Knowledge Constrained Concept Learning

09 Sept 2025 (modified: 07 Jan 2026) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: medical imaging, diabetic retinopathy, generalization, concept-based model
TL;DR: We propose KCCL, a framework that enables robust concept-based diabetic retinopathy grading, achieving superior domain generalization through refined concept layer knowledge distillation and regularization.
Abstract: Diabetic retinopathy (DR) grading models often suffer a significant performance drop when deployed to unseen clinical domains. A promising strategy is to mirror the diagnostic process of clinicians, who identify specific pathological signs before rendering a judgment. Concept-based models (CBMs) are well suited to this, but their effectiveness hinges on concept supervision, which is rarely available in medical imaging. To address this, we propose Knowledge Constrained Concept Learning (KCCL), a novel framework that achieves robust domain generalization through concept learning under knowledge constraints. We first curate DRL6k, a dataset of 6,000 fundus images with lesion annotations, and train a lesion detection model to provide concept supervision via knowledge distillation. However, directly using this supervision may introduce noise and inconsistencies. KCCL therefore employs a knowledge constraint mechanism: it leverages medical priors to correct implausible concept predictions and to reduce the influence, during distillation, of predictions that deviate from clinical expectations, while also directly penalizing the model for producing clinically inconsistent concept predictions. Extensive experiments on multiple unseen target datasets demonstrate that KCCL significantly outperforms state-of-the-art domain generalization and DR grading methods, while producing clinically coherent and interpretable predictions.
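The knowledge constraint mechanism described above can be sketched as a loss function. This is a minimal illustration, not the paper's implementation: the plausibility weighting (exponential decay with deviation from a clinical prior), the shrinkage of teacher targets toward that prior, and the implication-style consistency penalty are all hypothetical design choices standing in for the unspecified details, as are all function and parameter names.

```python
import torch
import torch.nn.functional as F

def knowledge_constrained_distillation(student_logits, teacher_probs, prior_probs,
                                       consistency_pairs=None, tau=0.5,
                                       lambda_consist=0.1):
    """Distill teacher concept predictions into the student while
    (a) down-weighting teacher targets that deviate from clinical priors,
    (b) shrinking implausible targets toward the prior, and
    (c) penalizing student predictions that violate concept implications.

    All tensors have shape (batch, num_concepts) with probabilities in [0, 1];
    `consistency_pairs` lists (i, j) where concept j clinically implies concept i
    (e.g. "severe hemorrhage" implies "hemorrhage present").
    """
    # (a) Plausibility weight: near 1 when teacher agrees with the prior.
    deviation = (teacher_probs - prior_probs).abs()
    weights = torch.exp(-deviation / tau)  # values in (0, 1]

    # (b) Correct implausible targets by interpolating toward the prior.
    corrected = weights * teacher_probs + (1.0 - weights) * prior_probs

    student_probs = torch.sigmoid(student_logits)
    distill = F.binary_cross_entropy(student_probs, corrected, weight=weights)

    # (c) Clinical-consistency penalty: p(concept j) should not exceed p(concept i).
    consist = student_logits.new_zeros(())
    if consistency_pairs:
        for i, j in consistency_pairs:
            consist = consist + F.relu(student_probs[:, j] - student_probs[:, i]).mean()

    return distill + lambda_consist * consist
```

A teacher prediction far from the prior thus contributes less gradient and is pulled toward the clinically expected value, while the student is nudged toward internally consistent concept activations regardless of the teacher.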
Primary Area: applications to computer vision, audio, language, and other modalities
Submission Number: 3325