Towards Global Interaction Efficiency of Graph Networks

23 Sept 2023 (modified: 11 Feb 2024) | Submitted to ICLR 2024
Primary Area: learning on graphs and other geometries & topologies
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: graph neural networks
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We introduce global interaction efficiency as a novel metric for assessing GNN performance.
Abstract: A graph, viewed globally, inherently embodies interactions among all of its nodes. Going beyond existing studies of long-range interactions, which focus on individual node pairs, we therefore study the interactions in a graph from a global perspective. Traditional GNNs capture such interactions by aggregating over local connectivity. While prevalent, this approach has known limitations such as under-reaching and over-squashing. In response, we introduce a global interaction perspective and propose interaction efficiency as a metric for assessing GNN performance. This metric provides a unified lens for understanding several key aspects of GNNs, including positional encodings in Graph Transformers, the expressiveness of spectral graph filters, over-squashing, and the role of nonlinearity in GNNs. Guided by the global interaction perspective, we present Universal Interaction Graph Convolution, which exhibits superior interaction efficiency. This new architecture achieves highly competitive performance on a variety of graph-level learning tasks. Code is available at https://github.com/iclrsubmission-towards/UIGC.
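As background to the abstract's claim that traditional GNNs capture interactions by aggregating over local connectivity, the following is a minimal sketch of a standard GCN-style layer (not the paper's Universal Interaction Graph Convolution; the function name, shapes, and toy graph are illustrative assumptions):

import numpy as np

def gcn_layer(adj, features, weight):
    """One round of symmetric-normalized neighborhood aggregation:
    H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    adj_hat = adj + np.eye(adj.shape[0])                      # add self-loops
    deg_inv_sqrt = np.diag(1.0 / np.sqrt(adj_hat.sum(axis=1)))
    adj_norm = deg_inv_sqrt @ adj_hat @ deg_inv_sqrt          # normalize by degree
    return np.maximum(adj_norm @ features @ weight, 0.0)      # ReLU

# Tiny example: a 4-node path graph with one-hot features. After a single layer,
# node 0 receives no signal from node 3; information from distant nodes arrives
# only after many layers, which is the under-reaching/over-squashing issue the
# abstract refers to.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.eye(4)
W = np.random.default_rng(0).normal(size=(4, 2))
print(gcn_layer(A, H, W))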
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7293