Abstract: We propose an uncertainty-aware cross-modal attention framework for breast cancer classification using ultrasound imaging and structured clinical data. Our model incorporates a Clinical Knowledge Gate (CKG) that adaptively weights clinical features based on medical domain relevance, and an uncertainty-aware attention mechanism that dynamically fuses modalities based on confidence estimates. We employ contrastive learning to enhance alignment between modalities and address class imbalance through an adaptive focal loss with uncertainty regularization. Evaluated on the Breast-Lesions-USG dataset, our method outperforms traditional and deep learning baselines, achieving 90% accuracy and 98% recall. Grad-CAM visualizations confirm alignment with radiological markers, while uncertainty estimates improve interpretability and clinical trust.
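To make the fusion idea concrete, the sketch below illustrates one plausible way to combine a Clinical Knowledge Gate with confidence-weighted modality fusion. All layer sizes, the sigmoid-gated implementation of the CKG, and the use of learned log-variances as per-modality confidence estimates are illustrative assumptions, not the paper's exact architecture.

```python
# Minimal PyTorch sketch of uncertainty-aware cross-modal fusion with a
# Clinical Knowledge Gate. Dimensions and the confidence mechanism are
# assumptions for illustration only.
import torch
import torch.nn as nn


class ClinicalKnowledgeGate(nn.Module):
    """Re-weights structured clinical features with a learned gate."""

    def __init__(self, clin_dim: int):
        super().__init__()
        self.gate = nn.Sequential(nn.Linear(clin_dim, clin_dim), nn.Sigmoid())

    def forward(self, clin: torch.Tensor) -> torch.Tensor:
        # Element-wise relevance weighting of clinical features.
        return clin * self.gate(clin)


class UncertaintyAwareFusion(nn.Module):
    """Fuses image and clinical embeddings, weighting each modality by
    its estimated confidence (here, a learned log-variance proxy)."""

    def __init__(self, img_dim: int, clin_dim: int, fused_dim: int):
        super().__init__()
        self.ckg = ClinicalKnowledgeGate(clin_dim)
        self.img_proj = nn.Linear(img_dim, fused_dim)
        self.clin_proj = nn.Linear(clin_dim, fused_dim)
        # One log-variance head per modality as a simple confidence proxy.
        self.img_logvar = nn.Linear(img_dim, 1)
        self.clin_logvar = nn.Linear(clin_dim, 1)

    def forward(self, img_feat: torch.Tensor, clin_feat: torch.Tensor):
        clin_feat = self.ckg(clin_feat)
        z_img = self.img_proj(img_feat)
        z_clin = self.clin_proj(clin_feat)
        # Higher confidence = lower predicted variance; normalise the two
        # confidences into modality attention weights.
        conf = torch.cat(
            [-self.img_logvar(img_feat), -self.clin_logvar(clin_feat)], dim=-1
        ).softmax(dim=-1)
        fused = conf[..., :1] * z_img + conf[..., 1:] * z_clin
        return fused, conf  # fused embedding + per-modality weights


if __name__ == "__main__":
    fusion = UncertaintyAwareFusion(img_dim=512, clin_dim=32, fused_dim=256)
    img = torch.randn(4, 512)   # e.g. ultrasound CNN embeddings
    clin = torch.randn(4, 32)   # e.g. structured clinical features
    fused, weights = fusion(img, clin)
    print(fused.shape, weights.shape)  # (4, 256) and (4, 2)
```

The returned per-modality weights can also serve as an interpretability signal, indicating whether the imaging or the clinical stream dominated a given prediction.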
DOI: 10.1007/978-3-032-05559-0_22