XProvence: Zero-Cost Multilingual Context Pruning for Retrieval-Augmented Generation

Youssef Mohamed, Mohamed Elhoseiny, Thibault Formal, Nadezhda Chirkova

Published: 2026, Last Modified: 01 Mar 2026 · CoRR 2026 · CC BY-SA 4.0
Abstract: This paper introduces XProvence, a multilingual zero-cost context pruning model for retrieval-augmented generation (RAG), trained on 16 languages and supporting 100+ languages through effective cross-lingual transfer. Motivated by the growing use of RAG systems across diverse languages, we explore several strategies to generalize the Provence framework, which first integrated efficient zero-cost context pruning directly into the re-ranking model, beyond English. Across four multilingual question answering benchmarks, we show that XProvence prunes RAG contexts with minimal-to-no performance degradation and outperforms strong baselines. Our model is available at https://huggingface.co/naver/xprovence-reranker-bgem3-v2.