FR-LoRA: Fisher Regularized LoRA for Multilingual Continual Learning

Published: 10 Nov 2025 · Last Modified: 24 Jan 2026 · CIKM 2025 · CC BY 4.0
Abstract: Relevance in e-commerce product search is critical to ensuring that results accurately reflect customer intent. While large language models (LLMs) have recently advanced natural language processing capabilities, their high inference latency and significant infrastructure demands make them less suitable for real-time e-commerce applications. Consequently, transformer-based encoder models are widely adopted for relevance classification tasks. These models typically evaluate the relevance of a product to a given query by encoding the query and product title as input features. As e-commerce stores expand into new marketplaces, the need for language- and region-specific relevance models grows, often resulting in the sequential development and maintenance of a separate model per marketplace. To address this challenge, we introduce a multilingual continual learning (CL) framework that mitigates catastrophic forgetting. Our proposed method, FR-LoRA (Fisher Regularized LoRA), integrates Elastic Weight Consolidation (EWC) with marketplace-specific LoRA modules, where each LoRA module is regularized using the Fisher information matrix. FR-LoRA retains the same inference-time footprint as the base model, ensuring zero additional latency while enabling frequent, scalable updates. Empirically, our approach achieves a ~3% ROC-AUC improvement over single-marketplace baselines and outperforms several recent CL baselines on both proprietary and public datasets.
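
To make the mechanism concrete, below is a minimal PyTorch sketch of the Fisher-regularized training objective the abstract describes: a diagonal Fisher information estimate computed from squared gradients, and an EWC-style quadratic penalty that pins the marketplace-specific LoRA parameters to the values consolidated after the previous marketplace. All names here (`estimate_diag_fisher`, `fr_lora_loss`, `anchor`, `lam`) are illustrative assumptions, not the paper's actual API; this is a sketch of the general EWC-on-LoRA pattern, not the authors' implementation.

```python
import torch

def estimate_diag_fisher(model, loader, loss_fn, n_batches=32):
    """Diagonal Fisher approximation: average squared gradient
    of each trainable (i.e., LoRA) parameter over a few batches."""
    fisher = {n: torch.zeros_like(p)
              for n, p in model.named_parameters() if p.requires_grad}
    seen = 0
    for batch in loader:
        if seen >= n_batches:
            break
        model.zero_grad()
        loss_fn(model, batch).backward()
        for n, p in model.named_parameters():
            if p.requires_grad and p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
        seen += 1
    return {n: f / max(seen, 1) for n, f in fisher.items()}


def fr_lora_loss(model, batch, loss_fn, fisher, anchor, lam=1.0):
    """Task loss plus EWC penalty:
    L = L_task + lam * sum_i F_i * (theta_i - theta*_i)^2,
    where theta* (`anchor`) holds the LoRA parameters consolidated
    after the previous marketplace and F (`fisher`) controls how
    strongly each parameter is pinned to its anchor value."""
    task_loss = loss_fn(model, batch)
    penalty = sum(
        (fisher[n] * (p - anchor[n]) ** 2).sum()
        for n, p in model.named_parameters()
        if n in fisher
    )
    return task_loss + lam * penalty
```

In this reading, only the LoRA parameters are trainable (the encoder backbone stays frozen), so both the Fisher estimate and the penalty range over the LoRA weights alone, which keeps the regularization cheap and leaves the inference-time model unchanged.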