Dynamic Parametric Retrieval Augmented Generation

13 Sept 2025 (modified: 05 Jan 2026) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Retrieval Augmented Generation, Hypernetwork, Test-time Parametric Knowledge Enhancement
Abstract: Retrieval-augmented generation (RAG) enhances large language models (LLMs) by injecting externally retrieved documents into the input context. However, this in-context injection significantly increases inference costs and introduces knowledge conflicts, primarily because LLMs lack the corresponding parametric knowledge. Recently, Parametric RAG (PRAG) was proposed to overcome these limitations by embedding symbolic documents into LLM parameters, effectively reducing inference costs and conflicts through offline training. However, PRAG must convert all documents into parameters in advance, which incurs high training and storage costs and makes it difficult to generalize to unseen documents. To address these challenges, we propose Dynamic Parametric RAG (DyPRAG), a novel framework that leverages a lightweight parameter translator to efficiently convert symbolic documents into parametric knowledge online. Specifically, the parameter translator uses several linear layers to map document embeddings directly to LoRA modules for the feed-forward networks of the LLM. DyPRAG achieves test-time parametric knowledge enhancement by dynamically generating the requisite parameters, which not only reduces the inference cost and mitigates the knowledge conflicts inherent in RAG, but also lowers the training and storage overhead of PRAG. Extensive experiments on multiple datasets demonstrate the effectiveness and generalization capabilities of DyPRAG. Furthermore, combining contextual knowledge with test-time generated parametric knowledge offers a practical and more powerful RAG paradigm that updates parametric knowledge adaptively, enables superior knowledge fusion, and alleviates knowledge conflicts in real-world applications. Our code is available at https://anonymous.4open.science/r/DyPRAG_ICLR.
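To make the parameter-translator idea concrete, below is a minimal PyTorch sketch of a hypernetwork that maps a document embedding to the low-rank (LoRA) factors of a single FFN weight matrix, along with a test-time usage snippet. All names (e.g., ParameterTranslator), layer sizes, the rank, and the single-layer scope are illustrative assumptions for exposition, not the authors' actual implementation.

```python
import torch
import torch.nn as nn


class ParameterTranslator(nn.Module):
    """Hypothetical hypernetwork: document embedding -> LoRA factors (A, B)
    for one FFN projection weight of shape (out_features, in_features)."""

    def __init__(self, doc_dim: int, in_features: int, out_features: int,
                 rank: int = 8, hidden_dim: int = 512):
        super().__init__()
        self.in_features = in_features
        self.out_features = out_features
        self.rank = rank
        # "Several linear layers" that translate the embedding into parameters.
        self.trunk = nn.Sequential(
            nn.Linear(doc_dim, hidden_dim),
            nn.ReLU(),
        )
        self.to_A = nn.Linear(hidden_dim, rank * in_features)
        self.to_B = nn.Linear(hidden_dim, out_features * rank)

    def forward(self, doc_emb: torch.Tensor):
        h = self.trunk(doc_emb)                       # (batch, hidden_dim)
        A = self.to_A(h).view(-1, self.rank, self.in_features)
        B = self.to_B(h).view(-1, self.out_features, self.rank)
        return A, B


# Test-time parametric knowledge enhancement (dimensions are placeholders):
# generate a LoRA delta from a retrieved document's embedding and add it to
# a frozen FFN projection weight during inference.
doc_emb = torch.randn(1, 768)        # e.g., from an off-the-shelf sentence encoder
translator = ParameterTranslator(doc_dim=768, in_features=4096, out_features=11008)
A, B = translator(doc_emb)
delta_W = (B @ A).squeeze(0)         # (out_features, in_features)
# The frozen FFN weight W would then be used as W + delta_W, injecting the
# document's knowledge parametrically instead of via the input context.
```

In this reading, the hypernetwork is trained once offline, after which any unseen document can be converted to parameters in a single forward pass, which is what avoids PRAG's per-document training and storage cost.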
Supplementary Material: zip
Primary Area: foundation or frontier models, including LLMs
Submission Number: 4640