Keywords: graph neural networks, variance preserving pooling, expressivity, signal propagation
Abstract: The success of graph neural networks (GNNs), and of message passing neural networks in particular, critically depends on the functions employed for message aggregation and graph-level readout. Using signal propagation theory, we propose a variance-preserving aggregation function, which maintains the expressivity of GNNs while improving learning dynamics. Our results could pave the way towards normalizer-free or self-normalizing GNNs.
Submission Number: 77
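As a rough illustration of the idea in the abstract: a sum over n i.i.d. messages scales the variance by n, while a mean scales it by 1/n; dividing the sum by sqrt(n) keeps the variance constant. The following is a minimal sketch, not the authors' implementation; the function name `vpa_aggregate` and the NumPy formulation are assumptions for illustration.

```python
import numpy as np

def vpa_aggregate(messages: np.ndarray) -> np.ndarray:
    """Variance-preserving aggregation over a set of neighbor messages.

    messages: array of shape (n, d) holding n incoming d-dimensional messages.
    Summing n i.i.d. zero-mean messages multiplies the variance by n, and
    averaging divides it by n; scaling the sum by 1/sqrt(n) leaves the
    variance unchanged while remaining as expressive as a sum (it differs
    from the sum only by a positive per-node factor).
    """
    n = messages.shape[0]
    return messages.sum(axis=0) / np.sqrt(n)

# Compare the empirical variance of sum, mean, and the variance-preserving
# aggregation over many random neighborhoods of size n.
rng = np.random.default_rng(0)
n, d, trials = 16, 1, 20000
x = rng.standard_normal((trials, n, d))       # unit-variance messages
var_sum = x.sum(axis=1).var()                 # ~ n
var_mean = x.mean(axis=1).var()               # ~ 1/n
var_vpa = (x.sum(axis=1) / np.sqrt(n)).var()  # ~ 1
```

Under these assumptions, `var_vpa` stays near the input variance of 1, whereas `var_sum` grows with n and `var_mean` shrinks with it, which is the signal-propagation argument for the scaling.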