Cognition on Graph: Navigating Massive Knowledge Space via Cognitive Cycles and Bidirectional Graph-Text Synergy
Keywords: Retrieval-Augmented Generation, Knowledge Graphs, GraphRAG, Knowledge Base Question Answering, Large Language Model Agents
Abstract: Retrieval-Augmented Generation (RAG) has empowered Large Language Models (LLMs) to tackle knowledge-intensive tasks.
However, navigating global, heterogeneous knowledge bases (large-scale knowledge graphs and text corpora) for complex reasoning remains challenging.
Existing methods typically employ reactive, graph-driven exploration strategies, which blindly follow graph topology without adapting to the question context or evolving exploration progress, and lack deep bidirectional synergy between graph and text.
To address these limitations, we propose CoG (Cognition on Graph), a cognitive-inspired, training-free framework for adaptive knowledge exploration.
Drawing from human problem-solving, CoG performs a continuous plan-explore-reflect cycle, where it proactively formulates investigation plans, performs dual-source retrieval, and dynamically reflects on progress to adjust strategies.
Crucially, it establishes deep bidirectional synergy between structured graphs and unstructured text, where entities extracted from text dynamically guide graph exploration to bridge knowledge gaps.
Extensive experiments on seven multi-hop QA benchmarks demonstrate that CoG significantly outperforms state-of-the-art methods while achieving superior exploration efficiency.
Our code and datasets are available at https://anonymous.4open.science/r/CoG-6ED6.
Paper Type: Long
Research Area: Retrieval-Augmented Language Models
Research Area Keywords: knowledge-augmented methods, retrieval-augmented generation, knowledge base QA, multihop QA, LLM agents
Contribution Types: NLP engineering experiment
Languages Studied: English
Submission Number: 9177