Combining Structure and Text: Learning Representations for Reasoning on Graphs

23 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: graph reasoning, structure representation, text representation, GNN, PLM
TL;DR: A method for graph reasoning combining both structural and textual information.
Abstract: Effective reasoning on real-world graphs requires exploiting both the structural information of the graph and the textual information attached to its nodes and edges. Recent research has largely followed two paradigms: graph neural networks (GNNs) that capture structural features, and language models that process textual information. While both approaches have shown impressive performance, integrating structural and textual information remains challenging; in particular, jointly training a GNN and a language model is difficult at the scale of real-world graphs. This paper introduces CoST, a novel framework tailored for graph reasoning tasks. Its optimization objective enables alternating training of the GNN and the pretrained language model (PLM): the PLM produces effective text representations, which in turn enhance the reasoning capability of the GNN. Empirical results demonstrate that CoST achieves state-of-the-art performance across representative benchmark datasets.
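The alternating training scheme described in the abstract can be illustrated with a toy sketch. This is not the authors' CoST implementation; the linear "PLM" projection, the single mean-aggregation "GNN" layer, the regression loss, and all shapes and update rules below are illustrative assumptions, meant only to show how one module is held fixed while the other is updated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 6 nodes with 8-dim bag-of-words-style text features.
n_nodes, d_text, d_hid = 6, 8, 4
X = rng.normal(size=(n_nodes, d_text))               # stand-in for text features
A = (rng.random((n_nodes, n_nodes)) < 0.3).astype(float)
A = np.maximum(A, A.T)                               # symmetric adjacency
np.fill_diagonal(A, 1.0)                             # add self-loops
A = A / A.sum(axis=1, keepdims=True)                 # row-normalized mean aggregation

W_plm = rng.normal(scale=0.1, size=(d_text, d_hid))  # hypothetical "PLM" head
W_gnn = rng.normal(scale=0.1, size=(d_hid, d_hid))   # one hypothetical GNN layer
Y = rng.normal(size=(n_nodes, d_hid))                # dummy regression targets

def forward(W_plm, W_gnn):
    H = X @ W_plm        # text representations produced by the "PLM"
    Z = A @ H @ W_gnn    # one round of neighborhood aggregation
    return H, Z

def loss(W_plm, W_gnn):
    _, Z = forward(W_plm, W_gnn)
    return float(((Z - Y) ** 2).mean())

lr = 0.05
losses = [loss(W_plm, W_gnn)]
for step in range(100):
    H, Z = forward(W_plm, W_gnn)
    G = 2.0 * (Z - Y) / Y.size                 # dL/dZ
    if step % 2 == 0:
        # Even steps: update the GNN while the PLM is frozen.
        W_gnn -= lr * (A @ H).T @ G
    else:
        # Odd steps: update the PLM while the GNN is frozen.
        W_plm -= lr * X.T @ (A.T @ G @ W_gnn.T)
    losses.append(loss(W_plm, W_gnn))
```

The key point is that each phase optimizes one module against representations produced by the other, so neither the (expensive) language model nor the GNN needs to be backpropagated through jointly at every step.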
Supplementary Material: zip
Primary Area: learning on graphs and other geometries & topologies
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3161