RAG-FGO: Enhancing RAG with Fungal Growth Optimizer for LLM Agents

18 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Retrieval-Augmented Generation, Generative Retrieval, Large Language Models, Agent Optimization, Question Answering, Knowledge-Intensive Tasks
TL;DR: We present RAG-FGO, a search-based framework that optimizes generative retrieval agents for RAG systems, outperforming prompting baselines such as ReAct on complex QA benchmarks.
Abstract: Generative retrieval leverages large language models (LLMs) to directly generate retrieval queries or candidate document representations, and has recently shown strong potential in open-domain question answering and knowledge-intensive tasks. Compared to traditional index-based retrieval methods, generative retrieval offers greater flexibility in handling semantic diversity and complex task requirements. However, existing approaches often rely on static prompt designs or fixed generation strategies, which struggle to deliver stable, efficient performance under semantic complexity, task variability, or noisy knowledge bases. To address these limitations, we propose RAG-FGO (Retrieval-Augmented Generation with Fungal Growth Optimizer), a heuristic search-based framework for optimizing dynamic generative retrieval agents in Retrieval-Augmented Generation (RAG) systems. RAG-FGO combines global and local search strategies over the solution space to produce more robust and effective retrieval prompts and parameters while reducing the risk of convergence to local optima. In addition, it introduces a query memory pool that stores high-performing prompts found during iterative optimization and uses them to guide subsequent search and generation. Experimental results show that RAG-FGO outperforms baselines such as Direct, ReAct, and Self-Act on benchmark datasets including HotpotQA, MuSiQue, and SQuAD, confirming its effectiveness for complex generative retrieval tasks.
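The abstract names two core mechanisms without giving algorithmic details: alternating global and local search over candidate retrieval prompts, and a query memory pool that retains high performers to steer later iterations. The sketch below is one plausible reading of that loop, not the authors' implementation; `mutate_prompt`, `crossover_prompts`, `evaluate`, `MemoryPool`, and `rag_fgo` are all hypothetical names, and the fitness function is a random stand-in for the paper's actual retrieval/QA scoring.

```python
import random
from dataclasses import dataclass, field

def mutate_prompt(prompt: str, rng: random.Random) -> str:
    """Local search move: a small perturbation of one prompt (placeholder edit)."""
    hints = ["Focus on entities.", "Decompose into sub-questions.", "Cite evidence spans."]
    return prompt + " " + rng.choice(hints)

def crossover_prompts(a: str, b: str) -> str:
    """Global search move: recombine two prompts into a new candidate."""
    words_a, words_b = a.split(), b.split()
    return " ".join(words_a[: len(words_a) // 2] + words_b[len(words_b) // 2 :])

def evaluate(prompt: str, rng: random.Random) -> float:
    """Placeholder fitness; in RAG-FGO this would be downstream retrieval/QA accuracy."""
    return rng.random()

@dataclass
class MemoryPool:
    """Query memory pool: keeps the top-k (score, prompt) pairs seen so far."""
    capacity: int = 8
    entries: list = field(default_factory=list)

    def add(self, score: float, prompt: str) -> None:
        self.entries.append((score, prompt))
        self.entries.sort(key=lambda e: e[0], reverse=True)
        del self.entries[self.capacity:]  # evict everything past the top-k

    def sample(self, rng: random.Random) -> str:
        return rng.choice(self.entries)[1]

def rag_fgo(seed_prompt: str, iterations: int = 50, pop_size: int = 10, seed: int = 0):
    """Iteratively optimize retrieval prompts, alternating global and local moves."""
    rng = random.Random(seed)
    pool = MemoryPool()
    population = [seed_prompt] * pop_size
    for _ in range(iterations):
        scored = [(evaluate(p, rng), p) for p in population]
        for score, prompt in scored:
            pool.add(score, prompt)
        # Global move: recombine with an elite from the memory pool;
        # local move: mutate the current prompt in place.
        population = [
            crossover_prompts(p, pool.sample(rng)) if rng.random() < 0.5
            else mutate_prompt(p, rng)
            for _, p in scored
        ]
    return pool.entries[0]  # best (score, prompt) found

if __name__ == "__main__":
    print(rag_fgo("Retrieve passages relevant to the question."))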
Primary Area: foundation or frontier models, including LLMs
Submission Number: 10735