TL;DR: We provide a theoretical analysis of message passing neural networks with uncertainty in node features.
Abstract: Existing theoretical guarantees for message passing neural networks (MPNNs) assume deterministic node features. We address a more realistic scenario in which noise or finite measurement precision introduces uncertainty in node feature values. First, we quantify this uncertainty by propagating the moments of node-feature distributions through the MPNN architecture; to propagate moments through activation functions, we use Taylor and pseudo-Taylor polynomial expansions. We then use the resulting node embedding distributions to analytically derive probabilistic adversarial robustness certificates for node classification tasks against $L_2$-bounded perturbations of node features. Second, we model node features as multivariate random variables and introduce the Feature Convolution Distance $FCD_p$, a pseudometric based on the Wasserstein distance that captures the discriminative power of MPNNs at the node level. We show that MPNNs are globally Lipschitz continuous with respect to $FCD_p$. Using the covering number of the resulting pseudometric space, which is a subset of the Wasserstein space, we derive generalization bounds for MPNNs with uncertainty in node features. Together, these two complementary approaches (moment propagation for adversarial robustness and $FCD_p$ on a subset of the Wasserstein space for generalization) establish a unified theoretical framework for MPNN reliability under node feature uncertainty.
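The moment-propagation idea in the abstract can be illustrated with a minimal sketch. Assuming a first-order Taylor expansion (the delta method) around the mean, the mean and variance of a feature passed through a `tanh` activation can be approximated as below; this is an illustrative assumption, not the paper's exact propagation scheme, and the function name is hypothetical.

```python
import numpy as np

def propagate_moments_tanh(mu, var):
    """Propagate mean and variance through tanh via a first-order
    Taylor expansion around the mean (delta method).
    Hedged sketch: the paper may use higher-order or pseudo-Taylor terms."""
    t = np.tanh(mu)
    deriv = 1.0 - t ** 2           # tanh'(mu) = 1 - tanh(mu)^2
    mu_out = t                     # E[f(x)] ~= f(E[x])
    var_out = (deriv ** 2) * var   # Var[f(x)] ~= f'(E[x])^2 * Var[x]
    return mu_out, var_out

# Example: at mu = 0, tanh is approximately linear with slope 1,
# so the variance passes through nearly unchanged.
mu_out, var_out = propagate_moments_tanh(np.array([0.0, 1.0]),
                                         np.array([0.1, 0.1]))
```

In a full MPNN forward pass, the linear message-passing step would propagate moments exactly (linearity of expectation and the variance of a linear map), with an approximation of this kind applied only at each nonlinearity.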
Code Dataset Promise: Yes
Code Dataset Url: https://github.com/moritz-laber/uncertainty-aware-mpnns
Signed Copyright Form: pdf
Format Confirmation: I agree that I have read and followed the formatting instructions for the camera ready version.
Submission Number: 1646