Keywords: Graph Neural Network; Graph Entropy; Message Passing Mechanism
TL;DR: A novel message passing paradigm that effectively encodes and propagates the structural distribution of node contexts.
Abstract: Message Passing Neural Networks (MPNNs) have emerged as the dominant framework for learning on graphs. However, their expressive power is fundamentally bounded by the 1-dimensional Weisfeiler-Lehman (1-WL) test. Existing methods for improving the expressive power of MPNNs mainly rely on higher-order or subgraph WL tests, which usually incur significantly increased memory usage and computational overhead. This paper addresses these limitations by introducing a novel message passing paradigm that effectively encodes and propagates the structural distribution of the context around each node. Instead of performing message passing directly on $k$-tuples or subgraphs, our method encodes and propagates structural information through a compact distributional statistic, namely the entropy of the node context. Furthermore, we propose a kernel-based aggregation scheme that quantifies the structural-distribution similarity between the contexts of different nodes. Theoretical analysis and empirical evaluations indicate that the proposed framework not only achieves higher expressive power but also significantly reduces computational and memory costs.
Supplementary Material: zip
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 8449
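To make the paradigm concrete, below is a minimal Python sketch of entropy-based message passing with kernel-weighted aggregation. It is an illustration only: the abstract does not specify the paper's definition of "node context", its entropy, or its kernel, so this sketch assumes a 1-hop context, the Shannon entropy of the neighbors' degree distribution, and a Gaussian kernel over entropy differences. The names context_entropy and kernel_message_passing are hypothetical.

import numpy as np

def context_entropy(adj, node):
    # Assumed definition: Shannon entropy of the degree distribution
    # over the node's 1-hop context. The paper's actual statistic may differ.
    neighbors = np.flatnonzero(adj[node])
    if neighbors.size == 0:
        return 0.0
    degrees = adj[neighbors].sum(axis=1)
    _, counts = np.unique(degrees, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))

def kernel_message_passing(adj, features, sigma=1.0):
    # One aggregation round: neighbor messages are weighted by a Gaussian
    # kernel over context-entropy differences, an illustrative proxy for
    # the paper's kernel-based aggregation scheme.
    n = adj.shape[0]
    ent = np.array([context_entropy(adj, v) for v in range(n)])
    out = np.zeros_like(features)
    for v in range(n):
        nbrs = np.flatnonzero(adj[v])
        if nbrs.size == 0:
            continue
        w = np.exp(-((ent[v] - ent[nbrs]) ** 2) / (2 * sigma ** 2))
        out[v] = (w[:, None] * features[nbrs]).sum(axis=0) / w.sum()
    return out

# Toy usage: a 4-node path graph with random 8-dimensional features.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
feats = np.random.rand(4, 8)
print(kernel_message_passing(adj, feats).shape)  # (4, 8)

Note the cost profile this illustrates: each node carries a single scalar statistic rather than materialized $k$-tuples or subgraphs, which is the source of the memory savings the abstract claims.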