Keywords: Whole Slide Images, Multiple Instance Learning, Knowledge Distillation, Clustering
TL;DR: Divide-and-Distill (D&D) enhances Multiple Instance Learning by clustering features and distilling expert knowledge to efficiently overcome representation bottlenecks.
Abstract: Multiple Instance Learning (MIL) is widely used for Whole Slide Image classification in computational pathology, yet existing approaches suffer from a representation bottleneck: diverse patch-level features are compressed into a single slide-level embedding. We propose Divide-and-Distill (D&D), which clusters the feature space into coherent regions, trains an expert model on each cluster, and distills their knowledge into a unified model. Experiments demonstrate that D&D consistently improves the accuracy and AUC of six state-of-the-art MIL methods while retaining single-model inference efficiency.
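The paper itself is not available here, but the two core operations the abstract names (clustering the patch-feature space and distilling expert knowledge into one model) can be illustrated with a minimal NumPy sketch. All function names and hyperparameters below are hypothetical choices for illustration, not the authors' implementation: k-means stands in for the clustering step, and a temperature-scaled KL divergence is the standard knowledge-distillation loss one might use to transfer expert predictions to the unified student.

```python
import numpy as np

def softmax(z, t=1.0):
    # Temperature-scaled softmax; higher t softens the distribution.
    z = z / t
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kmeans(X, k, iters=25, seed=0):
    # Plain k-means: partitions patch features X (n, d) into k clusters,
    # one hypothetical "coherent region" per cluster.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

def distill_loss(student_logits, teacher_logits, t=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by t^2 as in standard knowledge distillation.
    p = softmax(teacher_logits, t)
    q = softmax(student_logits, t)
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(-1)
    return float(kl.mean() * t * t)
```

In a D&D-style pipeline, `kmeans` would assign patch features to regions, an expert would be trained on each region's bags, and `distill_loss` would be minimized so the unified student matches each expert's softened outputs on its own cluster; at inference only the single student runs, which is why efficiency is preserved.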
Submission Number: 6