Keywords: Oversquashing, Graph theory, Deep learning
TL;DR: GUMP uses a unitary adjacency matrix for message passing to reduce oversquashing in GNNs while preserving graph connectivity and permutation-equivariance.
Abstract: The message passing mechanism contributes to the success of GNNs in various applications, but it also brings the oversquashing problem. Recent works combat oversquashing by improving the graph spectrum with rewiring techniques, but these disrupt the original graph connectivity and yield limited improvement in terms of the oversquashing measure. Motivated by unitary RNNs, we propose Graph Unitary Message Passing (GUMP), which alleviates oversquashing in GNNs by applying a unitary adjacency matrix for message passing. To design GUMP, we first propose a transformation that equips general graphs with unitary adjacency matrices while keeping their original connectivity. The unitary adjacency matrix is then obtained with a unitary projection algorithm that exploits the intrinsic structure of the unitary adjacency matrix and makes GUMP permutation-equivariant. In experiments, GUMP is incorporated into various GNN architectures, and extensive results show its effectiveness on a range of graph learning tasks.
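To make the core idea concrete, the following is a minimal sketch (not the paper's algorithm) of unitary message passing: an adjacency matrix is projected to its nearest unitary matrix via the polar projection from an SVD, and one propagation step then preserves the norm of node features, so signals are neither amplified nor squashed. The graph, the `nearest_unitary` helper, and the generic SVD-based projection are illustrative assumptions; the paper's own projection instead exploits the intrinsic structure of the unitary adjacency matrix.

```python
import numpy as np

def nearest_unitary(A):
    # Illustrative polar projection (not the paper's structured algorithm):
    # the closest unitary matrix (in Frobenius norm) to A is U @ Vh,
    # where A = U @ diag(S) @ Vh is the SVD of A.
    U, _, Vh = np.linalg.svd(A)
    return U @ Vh

# Toy 4-node directed cycle with self-loops (hypothetical example graph).
A = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1],
              [1, 0, 0, 1]], dtype=float)

U_adj = nearest_unitary(A)
# The projected matrix is unitary: U_adj^H U_adj = I.
assert np.allclose(U_adj.conj().T @ U_adj, np.eye(4), atol=1e-8)

# One unitary message-passing step on random node features:
# unitarity guarantees the feature norm is exactly preserved.
x = np.random.default_rng(0).standard_normal((4, 3))
h = U_adj @ x
assert np.isclose(np.linalg.norm(h), np.linalg.norm(x))
```

This norm preservation is the property that motivates using unitary propagation against oversquashing: repeated message-passing steps cannot exponentially shrink the contribution of distant nodes.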
Primary Area: learning on graphs and other geometries & topologies
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5412