Keywords: machine learning for healthcare, survival analysis, graph neural networks, multimodal learning, competing risks, interpretability
TL;DR: A multi-modal, interpretable graph neural network for cause-specific survival prediction with competing risks from heterogeneous EHR data.
Abstract: Survival prediction from electronic health records (EHRs) is crucial for clinical decision-making but remains challenging due to data heterogeneity, irregular sampling, and the presence of competing risks. We propose a novel multi-modal graph neural network that dynamically constructs and integrates modality-specific graphs from time-series, demographics, diagnostic codes, and radiographic text. Our hierarchical attention mechanism fuses intra- and inter-modality interactions while providing interpretable, cause-specific risk predictions. Trained end-to-end with a combination of negative log-likelihood, ranking, and structural losses, our model significantly outperforms existing survival and graph-based baselines across five real-world EHR datasets. We further demonstrate improved calibration and interpretability, highlighting its potential for robust and transparent clinical risk stratification.
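To make the training objective mentioned in the abstract concrete, below is a minimal sketch of a composite cause-specific survival loss combining a negative log-likelihood term with a pairwise ranking term. It assumes a discrete-time formulation with per-cause time-bin logits; the function name, tensor layout, and hyperparameters (`alpha`, `beta`, the ranking temperature) are illustrative assumptions, not the paper's implementation, and the structural (graph) regulariser from the abstract is omitted.

```python
import torch
import torch.nn.functional as F

def composite_survival_loss(cif_logits, event_times, event_causes, alpha=1.0, beta=0.1):
    """Illustrative composite loss: cause-specific NLL plus a pairwise ranking term.

    cif_logits:   (batch, num_causes, num_time_bins) unnormalised scores
    event_times:  (batch,) index of the observed or censoring time bin
    event_causes: (batch,) observed cause index, or -1 for censored subjects
    """
    batch, num_causes, num_bins = cif_logits.shape
    # Joint distribution over (cause, time bin) via a softmax across both axes.
    pmf = F.softmax(cif_logits.view(batch, -1), dim=1).view(batch, num_causes, num_bins)

    # --- Negative log-likelihood term ---
    observed = event_causes >= 0
    nll = torch.zeros((), device=cif_logits.device)
    if observed.any():
        # Probability mass on the true (cause, time) cell for uncensored subjects.
        p_event = pmf[observed, event_causes[observed], event_times[observed]]
        nll = nll - torch.log(p_event + 1e-8).mean()
    if (~observed).any():
        # Censored subjects: probability of surviving past the censoring bin.
        total_cif = pmf.cumsum(dim=2).sum(dim=1)          # (batch, num_bins)
        surv = 1.0 - total_cif[~observed, event_times[~observed]]
        nll = nll - torch.log(surv + 1e-8).mean()

    # --- Pairwise ranking term (concordance surrogate) ---
    cif = pmf.cumsum(dim=2)                                # per-cause cumulative incidence
    rank = torch.zeros((), device=cif_logits.device)
    for i in observed.nonzero(as_tuple=True)[0]:
        k, t = event_causes[i], event_times[i]
        at_risk = event_times > t                          # subjects still at risk at t
        if at_risk.any():
            diff = cif[i, k, t] - cif[at_risk, k, t]
            rank = rank + torch.exp(-diff / 0.1).mean()

    # The structural/graph loss described in the abstract would be added here.
    return alpha * nll + beta * rank
```

As a usage note, the sketch expects the model to emit one logit per (cause, time-bin) pair; the ranking term penalises subjects who experience cause k at time t but are assigned a lower cumulative incidence than subjects still at risk at that time.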
Submission Number: 3