Climbing the label tree: Hierarchy-preserving contrastive learning for medical imaging

ICLR 2026 Conference Submission 25134 Authors

20 Sept 2025 (modified: 08 Oct 2025) · License: CC BY 4.0
Keywords: hierarchy-preserving contrastive learning, medical imaging, self-supervised learning, taxonomy-aware representations, Euclidean embeddings, hyperbolic embeddings, prototype margin, hierarchical metrics, HF1, H-Acc, breast histopathology, representation learning
Abstract: Medical image labels are often organized by taxonomies (organ → tissue → subtype), yet standard self-supervised learning (SSL) ignores this structure. We present a hierarchy-preserving contrastive framework that makes the label tree a first-class training signal and an evaluation target. Our approach introduces two plug-in objectives: Hierarchy-Weighted Contrastive (HWC), which scales positive/negative pair strengths by shared ancestors to promote within-parent coherence, and Level-Aware Margin (LAM), a prototype margin that separates ancestor groups across levels. The formulation is geometry-agnostic and applies to Euclidean and hyperbolic embeddings without architectural changes. Across several benchmarks, including breast histopathology, the proposed objectives consistently improve representation quality over strong SSL baselines while better respecting the taxonomy. We evaluate with metrics tailored to hierarchy faithfulness—HF1 (hierarchical F1), H-Acc (tree-distance–weighted accuracy), and parent-distance violation rate—and also report top-1 accuracy for completeness. Ablations show that HWC and LAM are effective even without curvature, and combining them yields the most taxonomy-aligned representations. Taken together, these results provide a simple, general recipe for learning medical image representations that respect the label tree—advancing both performance and interpretability in hierarchy-rich domains.
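The abstract describes HWC as scaling contrastive pair strengths by the number of shared ancestors in the label tree. The sketch below illustrates one plausible reading of that idea in NumPy; the function names, the common-prefix weighting scheme, and the InfoNCE-style formulation are our illustrative assumptions, not the authors' exact objective.

```python
import numpy as np

def shared_ancestor_depth(path_a, path_b):
    """Length of the common prefix of two label paths, e.g.
    ("breast", "malignant", "ductal") vs ("breast", "malignant", "lobular") -> 2."""
    depth = 0
    for a, b in zip(path_a, path_b):
        if a != b:
            break
        depth += 1
    return depth

def hwc_loss(z, paths, temperature=0.1):
    """Hypothetical Hierarchy-Weighted Contrastive loss over a batch.

    z     : (N, D) array of embeddings
    paths : list of N label paths, each a tuple from root to leaf
    """
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # compare in cosine-similarity space
    sim = z @ z.T / temperature
    n = len(paths)
    max_depth = max(len(p) for p in paths)

    # Pair weight = fraction of ancestors shared; self-pairs get zero weight.
    w = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                w[i, j] = shared_ancestor_depth(paths[i], paths[j]) / max_depth

    # InfoNCE-style log-softmax over each row's non-self similarities,
    # with each pair's pull scaled by its ancestor-overlap weight.
    total = 0.0
    for i in range(n):
        others = [j for j in range(n) if j != i]
        logits = sim[i, others]
        log_z = np.log(np.sum(np.exp(logits - logits.max()))) + logits.max()
        total -= np.sum(w[i, others] * (logits - log_z))
    return total / max(w.sum(), 1e-8)
```

Under this weighting, two images of different benign subtypes (sharing organ and tissue ancestors) are pulled together more strongly than a benign/malignant pair that shares only the organ, which is the within-parent coherence the abstract attributes to HWC.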
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 25134