Foundation Models as Class-Incremental Learners for Dermatological Image Classification

Published: 21 Jul 2025, Last Modified: 21 Jul 2025 · MSB EMERGE 2025 Oral · CC BY 4.0
Keywords: Class-Incremental Learning, Continual Learning, Foundation Models, Dermatological Image Classification, Dermatology
TL;DR: Foundation Models as Class-Incremental Learners
Abstract: Class-Incremental Learning (CIL) aims to learn new classes over time without forgetting previously acquired knowledge. The emergence of foundation models (FMs) pretrained on large datasets presents new opportunities for CIL by offering rich, transferable representations. However, their potential for enabling incremental learning in dermatology remains largely unexplored. In this paper, we systematically evaluate frozen FMs pretrained on large-scale skin lesion datasets for CIL in dermatological disease classification. We propose a simple yet effective approach in which the backbone remains frozen and a lightweight MLP is trained incrementally for each task. This setup achieves state-of-the-art performance without forgetting, outperforming regularization-, replay-, and architecture-based methods. To further explore the capabilities of frozen FMs, we examine zero-training scenarios using nearest-mean classifiers with prototypes derived from their embeddings. Through extensive ablation studies, we demonstrate that this prototype-based variant can also achieve competitive results. Our findings highlight the strength of frozen FMs for continual learning in dermatology and support their broader adoption in real-world medical applications. Our code and datasets are available here.
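The zero-training, prototype-based variant described in the abstract can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the frozen foundation-model encoder is stood in for by any function mapping images to fixed-size embedding vectors, and the class names and structure below are assumptions for demonstration.

```python
import numpy as np


class PrototypeCIL:
    """Nearest-mean classifier over frozen-backbone embeddings.

    New classes are added incrementally by storing one prototype
    (the mean embedding) per class; no gradient training is needed
    and old data never has to be revisited, so nothing is forgotten.
    """

    def __init__(self):
        self.prototypes = {}  # class label -> mean embedding vector

    def add_task(self, embeddings, labels):
        """Register the classes of a new task from their embeddings."""
        for c in np.unique(labels):
            self.prototypes[int(c)] = embeddings[labels == c].mean(axis=0)

    def predict(self, embeddings):
        """Assign each sample to the class with the nearest prototype."""
        classes = sorted(self.prototypes)
        protos = np.stack([self.prototypes[c] for c in classes])
        dists = np.linalg.norm(
            embeddings[:, None, :] - protos[None, :, :], axis=-1
        )
        return np.array(classes)[dists.argmin(axis=1)]
```

Because each class is summarized by a single mean vector, classes seen in earlier tasks are unaffected when later tasks arrive, which mirrors the forgetting-free behavior the abstract reports for frozen FMs.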
Submission Number: 5