Uncertainty-Aware Message Passing Neural Networks

Published: 23 Sept 2025, Last Modified: 27 Oct 2025 · NPGML Poster · CC BY 4.0
Keywords: Message Passing Neural Networks, Uncertainty Quantification, Robustness, Pseudometric, Lipschitz Continuity
TL;DR: We perform theoretical analyses of message passing neural networks with uncertainty in node features.
Abstract: Existing theoretical guarantees for message passing neural networks (MPNNs) assume deterministic node features; in this work we address the more realistic setting where inherent noise or finite measurement precision makes node features uncertain. We model node features as multivariate Gaussian distributions and propagate their first and second moments through the MPNN architecture. We employ Polynomial Chaos Expansion to approximate nonlinearities, and use the resulting node embedding distributions to analytically derive probabilistic node-wise robustness certificates against $L_2$-bounded node feature perturbations. Moreover, modeling node features as multivariate random variables, we introduce the Feature Convolution Distance, $FCD_P$, a Wasserstein distance-based pseudometric that matches the discriminative power of node-level MPNNs, and we show that MPNNs are globally Lipschitz continuous with respect to $FCD_P$. Our framework subsumes the deterministic case via Dirac measures and provides a foundation for reasoning about algorithmic stability of MPNNs under uncertainty in node features.
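The moment-propagation step in the abstract can be illustrated for the simplest case: a linear message-passing layer applied to independent Gaussian node features. The sketch below is an assumption-laden toy, not the paper's implementation (the paper additionally handles nonlinearities via Polynomial Chaos Expansion); the function name `propagate_moments` and the independence assumption across nodes are illustrative choices.

```python
import numpy as np

def propagate_moments(A, mus, Sigmas, W):
    """Propagate Gaussian moments through one *linear* message-passing layer.

    Assumed model (a simplification of the paper's setting):
      x_u ~ N(mus[u], Sigmas[u]), independent across nodes u,
      h_v = W @ sum_u A[v, u] * x_u  (weighted aggregation + linear transform).

    For linear maps the output stays Gaussian, with moments given in closed form.
    """
    n, _ = mus.shape
    d_out = W.shape[0]
    out_mu = np.empty((n, d_out))
    out_Sigma = np.empty((n, d_out, d_out))
    for v in range(n):
        # Mean: linearity of expectation over the weighted neighbor sum.
        mu_v = A[v] @ mus
        # Covariance: independence means weights enter squared.
        Sigma_v = np.einsum('u,ude->de', A[v] ** 2, Sigmas)
        out_mu[v] = W @ mu_v
        out_Sigma[v] = W @ Sigma_v @ W.T
    return out_mu, out_Sigma
```

With a Dirac-like input (all `Sigmas` zero) this reduces to ordinary deterministic message passing, matching the abstract's remark that the deterministic case is subsumed via Dirac measures.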
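The pseudometric $FCD_P$ is described as Wasserstein distance-based. For Gaussian feature distributions, as assumed above, the 2-Wasserstein distance has a well-known closed form, which the following sketch computes; how $FCD_P$ composes such per-node distances over a graph is specific to the paper and is not reproduced here.

```python
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(mu1, Sigma1, mu2, Sigma2):
    """Closed-form 2-Wasserstein distance between N(mu1, Sigma1) and N(mu2, Sigma2):

      W2^2 = ||mu1 - mu2||^2 + Tr(Sigma1 + Sigma2 - 2 (Sigma2^{1/2} Sigma1 Sigma2^{1/2})^{1/2})
    """
    s2_half = sqrtm(Sigma2)
    cross = sqrtm(s2_half @ Sigma1 @ s2_half)
    # sqrtm may return tiny imaginary parts from round-off; keep the real part.
    bures = np.trace(Sigma1 + Sigma2 - 2 * np.real(cross))
    return float(np.sqrt(np.linalg.norm(mu1 - mu2) ** 2 + max(bures, 0.0)))
```

For Dirac measures (zero covariances) this collapses to the Euclidean distance between the means, consistent with the deterministic special case mentioned in the abstract.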
Submission Number: 109