GraphAgent: Exploiting Large Language Models for Interpretable Learning on Text-attributed Graphs

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Supplementary Material: pdf
Primary Area: learning on graphs and other geometries & topologies
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Text-attributed graph, Large language model, Node classification
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: This paper studies learning on text-attributed graphs, where each node is associated with a textual description. While graph neural networks (GNNs) have been widely employed for solving tasks on such graphs, they struggle to balance effectiveness and interpretability. Inspired by recent breakthroughs in large language models (LLMs), which have demonstrated remarkable capabilities with interpretable explanations across a variety of applications, we introduce GraphAgent. GraphAgent reframes learning on text-attributed graphs as an agent planning problem and parameterizes the agent as an LLM. This paradigm shift empowers the agent to take actions explicitly tailored to text-attributed graphs, enabling comprehensive exploration of both structural and textual features. Leveraging the expressive power of LLMs, the agent adeptly captures the intricate relationships inherent in the graph structure and textual descriptions, thereby yielding precise predictions and transparent reasoning processes. Extensive experiments conducted on various datasets underscore the effectiveness and interpretability of GraphAgent, shedding new light on the promising intersection of large language models and graph-based learning.
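To make the agent-planning framing concrete, below is a minimal sketch of what such a loop could look like. The action vocabulary (EXPAND a neighbor's text, PREDICT a label), the `call_llm` stub, and the `classify_node` helper are all illustrative assumptions, not the paper's actual interface or prompts.

```python
# Minimal sketch of an LLM-agent loop over a text-attributed graph.
# The action set and the call_llm stub are illustrative assumptions;
# the submission's actual prompts and action space may differ.

def call_llm(prompt: str) -> str:
    """Placeholder for an LLM call (e.g., a hosted or local model API)."""
    raise NotImplementedError

def classify_node(graph: dict[int, list[int]],
                  texts: dict[int, str],
                  node: int,
                  max_steps: int = 5) -> str:
    """graph maps node id -> neighbor ids; texts maps node id -> description."""
    context = [f"Target node text: {texts[node]}"]
    prompt = ""
    for _ in range(max_steps):
        prompt = (
            "You are classifying a node in a text-attributed graph.\n"
            + "\n".join(context)
            + f"\nNeighbors of target: {graph[node]}"
            + "\nActions: EXPAND <node_id> (read that node's text) "
              "or PREDICT <label>.\nNext action:"
        )
        action = call_llm(prompt).strip()
        if action.startswith("PREDICT"):
            # The transcript of EXPAND steps doubles as the reasoning trace.
            return action.split(maxsplit=1)[1]
        if action.startswith("EXPAND"):
            nid = int(action.split()[1])
            context.append(f"Node {nid} text: {texts.get(nid, '(missing)')}")
    # Budget exhausted: force a final prediction.
    return call_llm(prompt + "\nYou must PREDICT now:").strip()
```

In this framing, each EXPAND action interleaves structural exploration (which neighbor to visit) with textual evidence gathering, and the accumulated context serves as the transparent reasoning process the abstract refers to.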
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5433