Bundle Neural Networks for message diffusion on graphs

Published: 17 Jun 2024, Last Modified: 13 Jul 2024. ICML 2024 Workshop GRaM. CC BY 4.0.
Track: Extended abstract
Keywords: graph neural network, sheaf neural network, vector bundles
TL;DR: We propose Bundle Neural Networks, a new class of Graph Neural Networks that operate via message diffusion, a continuous version of message passing that helps mitigate over-smoothing and over-squashing.
Abstract: The dominant paradigm for learning on graph-structured data is message passing. Despite being a strong inductive bias, the local message passing mechanism suffers from pathological issues such as over-smoothing, over-squashing, and limited node-level expressivity. To address these limitations we propose Bundle Neural Networks (BuNN), a new type of GNN that operates via *message diffusion* over *flat vector bundles* – structures analogous to connections on Riemannian manifolds that augment the graph by assigning to each node a vector space and an orthogonal map. We show that BuNNs can mitigate over-smoothing and over-squashing, and that they are universal compact uniform approximators on graphs. We showcase the strong empirical performance of BuNNs on real-world tasks, achieving state-of-the-art results on several standard benchmarks.
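As a rough illustration of the message-diffusion idea described in the abstract, the sketch below builds a connection (bundle) Laplacian on a small graph, where each node carries a d-dimensional fibre and an orthogonal map, and then diffuses stacked node features with the heat kernel exp(-tL). The helper names (`random_orthogonal`, `bundle_laplacian`) and the dense-matrix construction are illustrative assumptions for a toy example, not the authors' implementation of BuNN.

```python
import numpy as np
from scipy.linalg import expm

def random_orthogonal(d, rng):
    # QR factorisation of a Gaussian matrix yields a random orthogonal map (illustrative choice)
    q, _ = np.linalg.qr(rng.standard_normal((d, d)))
    return q

def bundle_laplacian(adj, maps):
    # adj: (n, n) symmetric 0/1 adjacency; maps: list of (d, d) orthogonal matrices, one per node
    n, d = adj.shape[0], maps[0].shape[0]
    L = np.zeros((n * d, n * d))
    deg = adj.sum(axis=1)
    for u in range(n):
        # diagonal block: node degree times identity on the fibre
        L[u*d:(u+1)*d, u*d:(u+1)*d] = deg[u] * np.eye(d)
        for v in range(n):
            if adj[u, v]:
                # off-diagonal block: transport features from node v's fibre to node u's fibre
                L[u*d:(u+1)*d, v*d:(v+1)*d] = -maps[u] @ maps[v].T
    return L

rng = np.random.default_rng(0)
n, d = 5, 2
adj = np.zeros((n, n), dtype=int)
for u, v in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]:  # a 5-cycle
    adj[u, v] = adj[v, u] = 1
maps = [random_orthogonal(d, rng) for _ in range(n)]

L = bundle_laplacian(adj, maps)
x0 = rng.standard_normal((n * d, 1))   # node features stacked into one vector
x_t = expm(-1.0 * L) @ x0              # message diffusion for time t = 1
```

In contrast to a discrete message-passing step, the heat-kernel update mixes information across all nodes at once with a rate controlled by the diffusion time t, which is the sense in which message diffusion is a continuous analogue of message passing.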
Submission Number: 27