Exploring High-Order Message-Passing in Graph Transformers

23 Sept 2023 (modified: 11 Feb 2024), submitted to ICLR 2024
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Graph representation learning, Transformer
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: The Transformer architecture has demonstrated promising performance on graph learning tasks. However, the attention mechanism used in existing Graph Transformers (GTs) cannot capture the high-order correlations present in complex graphs, which limits their expressiveness. In this paper, we present a High-Order message-passing strategy within the Transformer architecture (HOtrans) that learns long-range, high-order relationships for graph representation. Recognizing that some nodes share similar properties, we extract communities from the entire graph and introduce a virtual node that connects all nodes within each community. Operating on these communities, we adopt a three-step message-passing scheme: capture the high-order information of each community in its virtual node; propagate long-range information between communities; and aggregate community-level representations back to the graph nodes. This facilitates effective global information passing: virtual nodes capture high-order community information and act as bridges for long-range information passing. We show that many existing GTs can be regarded as special cases of this framework. Experimental results demonstrate that HOtrans consistently achieves highly competitive performance across several node classification tasks.
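The three-step scheme in the abstract can be illustrated with a minimal sketch in PyTorch. Everything here is an assumption for illustration only (the module name `CommunityMessagePassing`, mean pooling into virtual nodes, a single multi-head attention layer between communities, and the concatenation-based fusion), not the authors' exact architecture.

```python
# Hypothetical sketch of community-based three-step message passing:
# (1) pool nodes into a virtual node per community, (2) propagate between
# communities, (3) broadcast community representations back to nodes.
import torch
import torch.nn as nn


class CommunityMessagePassing(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        # Step 2: attention among virtual community nodes (long-range propagation).
        self.community_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Step 3: fuse the broadcast community representation back into each node.
        self.fuse = nn.Linear(2 * dim, dim)

    def forward(self, x: torch.Tensor, community: torch.Tensor) -> torch.Tensor:
        # x: (N, dim) node features; community: (N,) community id per node.
        num_communities = int(community.max()) + 1

        # Step 1: capture community information in a virtual node (mean pooling here).
        virtual = torch.zeros(num_communities, x.size(1), device=x.device, dtype=x.dtype)
        virtual = virtual.index_add(0, community, x)
        counts = torch.bincount(community, minlength=num_communities).clamp(min=1)
        virtual = virtual / counts.unsqueeze(1).to(x.dtype)

        # Step 2: propagate long-range information between communities.
        virtual, _ = self.community_attn(virtual.unsqueeze(0),
                                         virtual.unsqueeze(0),
                                         virtual.unsqueeze(0))
        virtual = virtual.squeeze(0)

        # Step 3: aggregate community-level representations back to graph nodes.
        broadcast = virtual[community]  # (N, dim)
        return self.fuse(torch.cat([x, broadcast], dim=-1))


# Toy usage: 6 nodes split into 2 communities.
x = torch.randn(6, 16)
community = torch.tensor([0, 0, 0, 1, 1, 1])
out = CommunityMessagePassing(dim=16)(x, community)
print(out.shape)  # torch.Size([6, 16])
```

In this reading, the virtual nodes play the bridging role described in the abstract: node-to-node interaction across communities happens only through the attention step over virtual nodes, which keeps global information passing at the community level.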
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7254