MPformer: Advancing Graph Modeling Through Heterophily Relationship-Based Position Encoding

23 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: graph transformer, multi-hop aggregation, position encoding, heterophily
Abstract: Graph transformer models integrate the relative positional relationships among nodes into the transformer architecture, holding significant promise for modeling graph-structured data. They address certain limitations of graph neural networks (GNNs) in leveraging information from distant nodes. However, these models overlook the representations of neighboring nodes with dissimilar labels, i.e., heterophilous relationships. This limitation prevents such methods from scaling to the wide range of real-world heterophilous datasets. To mitigate this limitation, we introduce MPformer, which comprises an information aggregation module, Tree2Token, and a position encoding module, HeterPos. Tree2Token aggregates the information of each node and its neighbors at various hop distances, treats each node and its neighbor data as token vectors, and serializes these tokens into sequences. Furthermore, for each newly generated sequence, we introduce a novel position encoding technique called HeterPos. HeterPos employs the shortest path distance between nodes and their neighbors to define their relative positional relationships, while also capturing feature distinctions between neighboring nodes and ego-nodes, thereby incorporating heterophilous relationships into the Transformer architecture. We validate the efficacy of our approach through both theoretical analysis and practical experiments. Extensive experiments on various datasets demonstrate that our approach surpasses existing graph transformer models and traditional graph neural network (GNN) models.
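The abstract describes two components: Tree2Token, which serializes a node and its multi-hop neighborhood aggregates into a token sequence, and HeterPos, which encodes each token's hop distance and its feature difference from the ego node. The sketch below is a minimal, hedged illustration of that idea, not the authors' implementation; the mean-style hop aggregation, the additive combination of the distance and feature-difference terms, and all function names are assumptions.

```python
# Minimal sketch (not the authors' code) of Tree2Token-style multi-hop
# tokenization and a HeterPos-style position encoding, following the abstract.
import torch

def tree2token(x, adj, num_hops):
    """Serialize each node into tokens [h_0, h_1, ..., h_K], where h_k
    aggregates features propagated over k hops (mean aggregation assumed)."""
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
    p = adj / deg                        # row-normalized adjacency (1-hop averaging)
    tokens, h = [x], x
    for _ in range(num_hops):
        h = p @ h                        # propagate one additional hop
        tokens.append(h)
    return torch.stack(tokens, dim=1)    # (num_nodes, num_hops + 1, feat_dim)

def heterpos(tokens):
    """Position encoding built from (i) the hop distance of each token from the
    ego node and (ii) its feature difference from the ego token (per the abstract);
    the additive combination below is an assumption."""
    n, k, d = tokens.shape
    ego = tokens[:, :1, :]                                        # ego-node token
    hop = torch.arange(k, dtype=tokens.dtype).view(1, k, 1).expand(n, k, d)
    diff = tokens - ego                                           # heterophily-aware term
    return tokens + hop + diff

# Usage: x is (num_nodes, feat_dim); adj is a dense 0/1 adjacency matrix.
x, adj = torch.randn(5, 8), torch.randint(0, 2, (5, 5)).float()
seq = heterpos(tree2token(x, adj, num_hops=3))   # per-node token sequences for a Transformer
```

The resulting per-node sequences can then be fed to a standard Transformer encoder, which is how the abstract frames the overall architecture.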
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7577