GNN-VPA: A Variance-Preserving Aggregation Strategy for Graph Neural Networks

Published: 19 Mar 2024, Last Modified: 16 Apr 2024
Venue: Tiny Papers @ ICLR 2024
License: CC BY 4.0
Keywords: graph neural networks, variance preserving pooling, expressivity, signal propagation
Abstract: The success of graph neural networks (GNNs), and of message passing neural networks in particular, critically depends on the functions employed for message aggregation and graph-level readout. Using signal propagation theory, we propose a variance-preserving aggregation function, which maintains the expressivity of GNNs while improving learning dynamics. Our results could pave the way towards normalizer-free or self-normalizing GNNs.
Submission Number: 77
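A minimal sketch of the idea behind variance-preserving aggregation, assuming the aggregator scales the sum of n incoming messages by 1/sqrt(n): under i.i.d., zero-mean message features, this keeps the output variance equal to the per-message variance, whereas plain sum inflates it by n and mean shrinks it by n. The function name and NumPy setup here are illustrative, not the paper's implementation.

```python
import numpy as np

def vpa_aggregate(messages: np.ndarray) -> np.ndarray:
    """Aggregate n neighbor messages (shape [n, d]) by sum / sqrt(n).

    For i.i.d. zero-mean messages with variance s^2 per feature,
    Var(sum) = n * s^2, so dividing by sqrt(n) restores variance s^2.
    """
    n = messages.shape[0]
    return messages.sum(axis=0) / np.sqrt(n)

# Empirical check of the variance argument (illustrative):
rng = np.random.default_rng(0)
msgs = rng.standard_normal((10000, 16))          # unit-variance messages
groups = msgs.reshape(2500, 4, 16)               # 2500 nodes, 4 neighbors each
agg = np.stack([vpa_aggregate(g) for g in groups])
print(round(float(agg.var()), 2))                # close to 1.0, unlike sum (~4) or mean (~0.25)
```

By contrast, sum aggregation would give an output variance of about 4 here and mean aggregation about 0.25, which is the signal-propagation motivation for the 1/sqrt(n) scaling.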