Keywords: Knowledge Distillation, Robustness, Explainability, Fetal Ultrasound Classification
TL;DR: Knowledge distillation approaches that reduce PCBM complexity while maintaining performance
Abstract: Deep learning has shown strong performance in fetal ultrasound standard-plane classification and quality assessment, but deployment in clinical settings remains constrained by slow inference and limited interpretability. Progressive Concept Bottleneck Models (PCBM) address interpretability by predicting anatomical and property concepts prior to classification, yet their computational complexity hinders real-time use.
We investigate knowledge distillation as a means to obtain lightweight PCBM-based student models that preserve diagnostic behaviour while improving deployability. Two strategies are evaluated: MiniPCBM, a reduced-capacity concept-based student that maintains hierarchical concept reasoning, and MicroPCBM, a pure classification student trained using softened teacher logits. We further analyse robustness in terms of cross-domain generalisation and stability of model explanations.
Across three datasets — a private clinical cohort and two public sets from Barcelona and Africa — MiniPCBM achieves the strongest in-distribution performance on the Denmark dataset (7-class accuracy: 0.93 vs. PCBM 0.90), while MicroPCBM provides competitive accuracy (0.90) with a ≥30× reduction in parameter count. Both distilled models deliver substantial efficiency gains, reducing inference latency from 42.91 ms (PCBM) to 10.58 ms and 6.61 ms, respectively, supporting real-time guidance. These results show that structured domain knowledge from PCBM can be effectively compressed into lightweight architectures, enabling interpretable and deployable fetal ultrasound AI for point-of-care use.
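For context, training a pure classification student on "softened teacher logits" typically refers to the classic Hinton-style distillation objective. The sketch below illustrates that objective only; it is a minimal PyTorch example, and the temperature `T` and mixing weight `alpha` are illustrative assumptions, not values taken from the submission.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.7):
    """Soft-label distillation loss (Hinton et al., 2015).

    Mixes KL divergence between temperature-softened teacher and student
    distributions with standard cross-entropy on the hard labels.
    T and alpha are illustrative defaults, not values from the paper.
    """
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    log_soft_student = F.log_softmax(student_logits / T, dim=1)
    # Scaling by T^2 keeps the soft-target gradients on the same
    # magnitude as the hard-label term as T varies.
    distill = F.kl_div(log_soft_student, soft_teacher,
                       reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, targets)
    return alpha * distill + (1.0 - alpha) * hard
```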
Primary Subject Area: Detection and Diagnosis
Secondary Subject Area: Interpretability and Explainable AI
Registration Requirement: Yes
Visa & Travel: No
Read CFP & Author Instructions: Yes
Originality Policy: Yes
Single-blind & Not Under Review Elsewhere: Yes
LLM Policy: Yes
Submission Number: 252