Co-HSF: Resource-Efficient One-Shot Semi-Supervised Adaptation of Histopathology Foundation Models

Published: 01 Jan 2025, Last Modified: 01 Sept 2025, AAAI Spring Symposia 2025, CC BY-SA 4.0
Abstract: Automated analysis of histopathological images has greatly augmented the ability of experts to perform deep phenotyping on biological samples. Current state-of-the-art (SOTA) methods for histopathology image classification rely on training deep neural networks with large annotated datasets, which can be costly to obtain. Recent studies propose to bypass annotated datasets by leveraging pre-trained foundation models (e.g. vision-language models) for zero-shot predictions. Moreover, fine-tuning these models enhances performance while requiring minimal labeled data (e.g. one-shot fine-tuning). However, the performance of one-shot fine-tuned histopathology foundation models on image classification tasks is understudied. In this work, we first explore the use of semi-supervised few-shot learning (SSFSL) for fine-tuning histopathology foundation models on one-shot datasets with unlabeled samples. We find that SOTA SSFSL methods improve fine-tuning performance, but their pseudo-labeling (i.e. assigning labels to unlabeled samples) strategies can increase inference times relative to zero-shot prediction. We then propose a Co-filtered Histopathology Semi-Supervised Few-Shot (Co-HSF) pipeline: a dual-SSFSL (i.e. with teacher and student models) training loop followed by a co-filtering (CF) pseudo-labeling strategy that efficiently leverages unlabeled data to improve semi-supervised performance and reduce inference times. Using the National Center for Tumor Diseases colorectal cancer dataset (NCT-CRC-HE), we show our proposed module achieves a 38.4% improvement in accuracy over zero-shot performance with only 9 labeled samples and over 53% faster inference times, while also outperforming other fine-tuning and SSFSL methods.
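The abstract does not spell out the co-filtering criterion, so the following is only a minimal illustrative sketch of one plausible reading: an unlabeled sample receives a pseudo-label only when the teacher and student models agree on its predicted class and the teacher is sufficiently confident. The function name `co_filter_pseudo_labels`, the confidence `threshold`, and the agreement rule are all assumptions for illustration, not the paper's actual CF strategy.

```python
import numpy as np

def co_filter_pseudo_labels(teacher_probs, student_probs, threshold=0.9):
    """Hypothetical co-filtering step (assumed, not the paper's exact rule).

    Keeps an unlabeled sample only when the teacher and student softmax
    predictions agree on the class AND the teacher's confidence exceeds
    `threshold`. Returns the indices of kept samples and their pseudo-labels.
    """
    teacher_pred = teacher_probs.argmax(axis=1)   # teacher's predicted class per sample
    student_pred = student_probs.argmax(axis=1)   # student's predicted class per sample
    teacher_conf = teacher_probs.max(axis=1)      # teacher's top-class probability
    keep = (teacher_pred == student_pred) & (teacher_conf >= threshold)
    return np.flatnonzero(keep), teacher_pred[keep]

# Toy usage: 5 unlabeled samples, 3 classes, random stand-ins for model outputs.
rng = np.random.default_rng(0)
t = rng.dirichlet(np.ones(3) * 0.3, size=5)  # stand-in teacher softmax outputs
s = rng.dirichlet(np.ones(3) * 0.3, size=5)  # stand-in student softmax outputs
kept_idx, pseudo_labels = co_filter_pseudo_labels(t, s, threshold=0.5)
print(kept_idx, pseudo_labels)
```

Under this reading, the filtered pseudo-labeled set would be folded back into the dual-SSFSL training loop as additional supervision; because filtering happens once at training time, inference needs only a single forward pass, which is consistent with the reported reduction in inference time.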