GaCLLM: Graph-aware Convolutional Large Language Model for Recommendation

ACL ARR 2025 February Submission5124 Authors

16 Feb 2025 (modified: 09 May 2025) · License: CC BY 4.0
Abstract: Leveraging auxiliary textual data can help with user profiling and item characterization in recommender systems (RSs). However, incomplete user and item descriptions limit the potential of textual information in RSs. To this end, we propose a graph-aware convolutional LLM method that elicits LLMs to summarize over a high-order interaction graph and generate fine-grained descriptions for users and items. We focus on two challenges in this paper: 1) the incompatibility between graph structures and text-oriented LLMs; and 2) LLMs' limited capability for long contexts. To bridge the gap between graph structures and LLMs, we employ the LLM as an aggregator in the graph convolution process, eliciting it to infer graph-based knowledge iteratively. To mitigate the information overload associated with large-scale graphs, we segment the graph processing into manageable steps, progressively incorporating multi-hop information in a least-to-most manner. Experiments on three real-world datasets demonstrate that our method consistently outperforms state-of-the-art approaches.
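The iterative, least-to-most aggregation described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: `summarize` is a hypothetical stand-in for an LLM prompt that fuses a node's description with its neighbors' descriptions; here it merges text deterministically so the control flow is runnable.

```python
def summarize(current: str, neighbor_texts: list[str]) -> str:
    # Hypothetical stand-in for an LLM call that rewrites a node's
    # description given its neighbors' descriptions.
    merged = "; ".join(sorted(set(neighbor_texts)))
    return f"{current} [+ {merged}]" if merged else current

def graph_aware_refine(graph: dict[str, list[str]],
                       desc: dict[str, str],
                       hops: int) -> dict[str, str]:
    """One hop per iteration: each node's description is re-summarized
    from its direct neighbors, so after k iterations a description
    incorporates k-hop information (least-to-most)."""
    for _ in range(hops):
        desc = {
            node: summarize(desc[node], [desc[n] for n in graph.get(node, [])])
            for node in desc
        }
    return desc

# Toy user-item interaction graph: u1 interacted with i1 and i2, etc.
graph = {"u1": ["i1", "i2"], "u2": ["i1"], "i1": ["u1", "u2"], "i2": ["u1"]}
desc = {"u1": "user1", "u2": "user2", "i1": "item1", "i2": "item2"}
refined = graph_aware_refine(graph, desc, hops=2)
```

After two hops, u1's refined description already reflects information from user2 (a 2-hop neighbor via item1), mirroring how segmenting the convolution into per-hop steps keeps each LLM call's context small.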
Paper Type: Long
Research Area: Language Modeling
Research Area Keywords: large language model, recommender systems, graph convolutional networks
Contribution Types: NLP engineering experiment
Languages Studied: English, Chinese
Submission Number: 5124