Unify Graph Learning with Text: Unleashing LLM Potentials for Session Search

Published: 23 Jan 2024, Last Modified: 23 May 2024 · TheWebConf24
Keywords: Session Search, Large Language Model, Heterogeneous Graph
Abstract: Session search involves a series of interactive queries and actions by a user to fulfill a complex information need. Current strategies typically prioritize sequential modeling for deep semantic understanding, often overlooking the graph structure in interactions. On the other hand, while some approaches focus on capturing structural behavior data, they use a generalized representation for documents, neglecting nuanced word-level semantic modeling. In this paper, we propose a model named Symbolic Graph Ranker (SGR), which aims to take advantage of both text-based and graph-based approaches by leveraging the power of recent Large Language Models (LLMs). Concretely, we first introduce a method to convert graph-structured data into text using symbolic grammar rules. This allows the session search history, interaction process, and task description to be integrated seamlessly as inputs for the LLM. Moreover, given the natural discrepancy between LLMs pre-trained on textual corpora and the symbolic text we produce using our graph-to-text grammar, our objective is to enhance LLMs' ability to capture graph structures within a textual format. To achieve this, we introduce a set of self-supervised symbolic learning tasks, including link prediction, node content generation, and generative contrastive learning, to enable LLMs to capture topological information from coarse-grained to fine-grained levels. Experimental results and comprehensive analysis on two benchmark datasets, AOL and Tiangong-ST, confirm the superiority of our approach. Our paradigm also offers a novel and effective methodology that bridges the gap between traditional search strategies and modern LLMs.
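The graph-to-text step described in the abstract can be illustrated with a minimal sketch. The node/edge schema and the bracketed grammar tokens below are illustrative assumptions for exposition, not the paper's actual symbolic grammar:

```python
# Sketch of serializing a heterogeneous session graph (query/document nodes,
# typed interaction edges) into symbolic text suitable as LLM input.
# The grammar tokens (<node>, <edge>, -[rel]->) are hypothetical.

def graph_to_text(nodes, edges):
    """Render a session interaction graph as line-based symbolic text.

    nodes: dict of node_id -> (node_type, content)
    edges: list of (src_id, relation, dst_id) triples
    """
    lines = []
    for nid, (ntype, content) in nodes.items():
        lines.append(f"<node id={nid} type={ntype}> {content} </node>")
    for src, rel, dst in edges:
        lines.append(f"<edge> {src} -[{rel}]-> {dst} </edge>")
    return "\n".join(lines)

# Toy session: one query followed by a clicked document.
nodes = {
    "q1": ("query", "cheap flights to tokyo"),
    "d1": ("doc", "Tokyo flight deals and schedules"),
}
edges = [("q1", "click", "d1")]
print(graph_to_text(nodes, edges))
```

A serialization like this lets session history and interaction structure share one textual input; the paper's self-supervised tasks (e.g., link prediction over such text) would then train the LLM to recover the underlying topology.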
Track: Search
Submission Guidelines Scope: Yes
Submission Guidelines Blind: Yes
Submission Guidelines Format: Yes
Submission Guidelines Limit: Yes
Submission Guidelines Authorship: Yes
Student Author: Yes
Submission Number: 1560