A Fast and Effective Alternative to Graph Transformers

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: learning on graphs and other geometries & topologies
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Graph Transformer, Graph Neural Networks, Graph Representation Learning
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We propose a fast and effective alternative to Graph Transformers that leverages shallow neighborhood propagation and long convolutions to effectively capture local and global dependencies.
Abstract: Graph Neural Networks (GNNs) have shown impressive performance in graph representation learning. However, GNNs struggle to capture long-range dependencies, which limits their expressive power. To tackle this challenge, Graph Transformers (GTs) were introduced; they use self-attention to model pairwise node relationships. Despite their advantages, GTs typically suffer from quadratic complexity in the number of nodes, hindering their applicability to large graph datasets. In this work, we present Graph-Enhanced Contextual Operator (GECO), a fast and effective alternative to GTs that leverages shallow neighborhood propagation and global convolutions to capture local and global dependencies. Evaluations on an extensive collection of benchmarks show that GECO consistently achieves quality superior or comparable to existing GTs across graphs of various types and scales, improving the SOTA by up to 4.5%. Remarkably, these results are achieved while maintaining quasilinear time and memory scaling, making GECO a promising solution for large-scale graph representation learning.
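To make the abstract's two ingredients concrete, below is a minimal sketch (assumed, not the authors' implementation) of a block that combines shallow neighborhood propagation for local structure with an FFT-based global convolution for long-range dependencies; names such as `GECOSketch`, the hop count, and the learned kernel are illustrative only.

```python
import torch
import torch.nn as nn


class GECOSketch(nn.Module):
    def __init__(self, num_nodes: int, dim: int, hops: int = 2):
        super().__init__()
        self.hops = hops
        # One learnable convolution kernel per feature channel, spanning all nodes.
        self.kernel = nn.Parameter(torch.randn(num_nodes, dim) * 0.02)
        self.out = nn.Linear(2 * dim, dim)

    def forward(self, x: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, dim) node features; adj_norm: sparse normalized adjacency.
        # (1) Shallow neighborhood propagation: a few sparse matmuls, O(hops * |E|).
        local = x
        for _ in range(self.hops):
            local = torch.sparse.mm(adj_norm, local)

        # (2) Global convolution over the node sequence via FFT, O(N log N),
        #     avoiding the O(N^2) cost of full self-attention.
        n = x.shape[0]
        x_f = torch.fft.rfft(x, n=2 * n, dim=0)
        k_f = torch.fft.rfft(self.kernel, n=2 * n, dim=0)
        global_ = torch.fft.irfft(x_f * k_f, n=2 * n, dim=0)[:n]

        # Fuse the local and global views of each node.
        return self.out(torch.cat([local, global_], dim=-1))
```

The sparse propagation scales with the number of edges and the FFT-based convolution with N log N, which is what keeps the overall block quasilinear in the claimed sense; how GECO actually mixes the two paths is not specified here.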
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6167