Topoformer: Topology-Infused Transformers for Medical Imaging

Sayoni Chakraborty, Philmore Koung, Baris Coskunuzer

Published: 27 Nov 2025, Last Modified: 09 Dec 2025. ML4H 2025 Poster. License: CC BY 4.0
Keywords: 3D medical imaging, transformers, topological data analysis, persistent homology, deep learning, brain tumor
TL;DR: Topoformer integrates sliding-band topological signatures with transformers, achieving state-of-the-art accuracy and efficiency for 3D medical image classification in low-data settings.
Track: Proceedings
Abstract: Deep learning has transformed 2D medical imaging, but scaling to 3D volumes remains difficult due to high compute, scarce annotations, and the loss of global context in patch-based pipelines. We present **Topoformer**, a transformer framework that makes 3D classification both data- and compute-efficient by integrating topological priors. First, we introduce a *sliding-band cubical filtration* that replaces a single global persistent-homology pass with overlapping intensity bands, yielding an *ordered sequence* of Betti tokens (components, tunnels, cavities). These tokens act as transformer inputs, enabling multi-scale topological reasoning without early saturation. Second, we propose *Topological Supervised Contrastive Learning* (TopoSupCon), which treats the image and its label-preserving topological view as complementary modalities, reducing reliance on brittle geometric or generative augmentations. On 3D brain MRI tumor grading and chest CT benchmarks in low-data regimes, **Topoformer** achieves consistent gains over strong 3D CNN and ViT baselines, including improvements up to 12 AUC points and 8 accuracy points. Our results show that sequential, topology-aware representations provide a powerful inductive bias for volumetric medical image analysis.
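The sliding-band idea from the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the function name, band parameters, and the use of `scipy.ndimage.label` are all assumptions for illustration. Each overlapping intensity band binarizes the volume, and counting connected components of that mask gives the Betti-0 number for the band; the full method would additionally track tunnels (Betti-1) and cavities (Betti-2) via cubical persistent homology (e.g. with a TDA library), which this sketch omits.

```python
import numpy as np
from scipy import ndimage  # assumption: SciPy is available for component labeling


def sliding_band_tokens(volume, n_bands=8, overlap=0.5):
    """Ordered sequence of Betti-0 counts over overlapping intensity bands.

    Hypothetical sketch of the sliding-band filtration: each band selects the
    voxels whose intensity falls inside it, and we count the connected
    components of that binary mask.  The paper's method also tracks tunnels
    (Betti-1) and cavities (Betti-2); this sketch covers components only.
    """
    lo, hi = float(volume.min()), float(volume.max())
    width = (hi - lo) / n_bands
    # Windows of the same width slide in steps of width * (1 - overlap).
    n_tokens = int(round((n_bands - 1) / (1.0 - overlap))) + 1
    starts = np.linspace(lo, hi - width, n_tokens)
    tokens = []
    for s in starts:
        mask = (volume >= s) & (volume <= s + width)
        _, n_components = ndimage.label(mask)  # 6-connected components in 3D
        tokens.append(n_components)
    return np.asarray(tokens)


# Toy 3D "scan": the resulting ordered token sequence is what would be
# embedded and fed to the transformer as inputs.
rng = np.random.default_rng(0)
vol = rng.random((16, 16, 16))
print(sliding_band_tokens(vol))
```

With `n_bands=8` and 50% overlap this yields 15 ordered tokens, one per band, preserving the coarse-to-fine intensity ordering the abstract relies on.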
General Area: Models and Methods
Specific Subject Areas: Medical Imaging
Data And Code Availability: Yes
Ethics Board Approval: No
Entered Conflicts: I confirm the above
Anonymity: I confirm the above
Submission Number: 34