AURORA: Neuro-Symbolic Continual Indexing for Evolving RAG Systems

ACL ARR 2026 January Submission6796 Authors

05 Jan 2026 (modified: 20 Mar 2026) · ACL ARR 2026 January Submission · CC BY 4.0
Keywords: Retrieval-Augmented Generation, Continual Learning, Distribution Shift, Approximate Nearest Neighbor Search, Neuro-Symbolic Retrieval, Meta-Learning, Uncertainty-Aware Routing, Product Quantization, Sparse Retrieval, Generative Retrieval
Abstract: Retrieval-Augmented Generation (RAG) systems depend on non-parametric indices to access external knowledge, yet most retrieval infrastructure assumes a stationary query–document distribution after index construction. In dynamic settings involving continual knowledge updates or evolving terminology, this assumption often fails: retrieval performance degrades, while full re-indexing remains computationally expensive. We propose AURORA, a neuro-symbolic framework for adapting retrieval indices under distribution shift by treating index maintenance as a few-shot continual learning problem. AURORA decouples discrete index structure from continuous metric representations, enabling efficient adaptation of neural components while preserving index topology. A lightweight Bayesian routing policy further balances stability and plasticity by dynamically selecting among adaptive neural indices and static fallbacks based on uncertainty estimates. Across dense, learned sparse (SPLADE), and generative (DSI) retrieval settings, AURORA recovers up to +26.9% Recall@10 on novel topics relative to static baselines, while adapting significantly faster than full retraining (28 ms vs. 5.1 s).
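The abstract's two core ideas, a frozen index whose neural (continuous) side is adapted without rebuilding the discrete structure, and an uncertainty-aware router that falls back to the static path when confidence is low, can be sketched in a toy form. All names, shapes, and the confidence threshold below are illustrative assumptions, not details from the paper; brute-force cosine search stands in for any fixed ANN index.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (hypothetical): a static corpus of document embeddings plays the
# role of the frozen index, and a learnable linear adapter W maps shifted
# queries back toward the original embedding space, so the discrete index
# structure never needs rebuilding.
dim, n_docs = 32, 200
doc_embs = rng.normal(size=(n_docs, dim))
doc_embs /= np.linalg.norm(doc_embs, axis=1, keepdims=True)

W = np.eye(dim)  # adapter starts as identity (no shift correction learned yet)

def retrieve(query, embs, k=5):
    """Brute-force cosine top-k; stands in for any fixed ANN index."""
    scores = embs @ (query / np.linalg.norm(query))
    top = np.argsort(-scores)[:k]
    return top, scores[top]

def route(query, W, threshold=0.4):
    """Uncertainty-aware routing: trust the adapted query if its top score is
    confident enough, otherwise fall back to the static (unadapted) path."""
    adapted = W @ query
    ids, scores = retrieve(adapted, doc_embs)
    if scores[0] >= threshold:          # confident: use the adapted path
        return ids, "adapted"
    ids, _ = retrieve(query, doc_embs)  # low confidence: static fallback
    return ids, "static"

query = doc_embs[7] + 0.1 * rng.normal(size=dim)  # query near document 7
ids, path = route(query, W)
print(path, ids[0])
```

In a real system, W would be fitted few-shot on recent query–document pairs while the ANN index stays untouched, and the threshold would come from a calibrated (e.g. Bayesian) uncertainty estimate rather than a fixed constant.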
Paper Type: Long
Research Area: Retrieval-Augmented Language Models
Research Area Keywords: retrieval-augmented generation, passage retrieval, dense retrieval, document representation, continual learning, transfer learning / domain adaptation, meta learning, few-shot learning, calibration/uncertainty, robustness
Contribution Types: NLP engineering experiment, Approaches for low-compute settings-efficiency, Theory
Languages Studied: English
Submission Number: 6796