LYNX - Lightweight Yielding Network eXpansion

ICLR 2026 Conference Submission 20165 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: continual learning, incremental learning, spectral methods, singular value decomposition, parameter-efficient adaptation, vision transformer, class-incremental learning, catastrophic forgetting, lifelong learning, model compression, adapter modules, object detection, transfer learning
Abstract: Continual learning (CL) aims to equip models with the ability to acquire new knowledge from a sequence of tasks without catastrophic forgetting or excessive parameter growth. We present Lynx (Lightweight Yielding Network eXpansion), a simple yet powerful approach to parameter-efficient continual learning via spectral singular value modulation. Lynx decomposes each weight matrix of a frozen, pretrained backbone into its singular value decomposition (SVD) and, for each new mask (potentially covering multiple tasks or class groups), learns a compact scaling vector that multiplicatively modulates the singular values. The effective weights for each task are dynamically recomposed from the original, fixed U and Vᵀ factors together with the learned scaling vector, yielding kilobyte-scale, swappable adapters with negligible inference overhead. Crucially, Lynx's parameter count grows only with the number of masks and the rank of the backbone weights, ensuring strong scalability and flexibility. We evaluate Lynx in class-incremental continual learning scenarios on sequential splits of CIFAR-100 (10 tasks), ImageNet-R (40 tasks), and ImageNet-A (40 tasks), where Lynx achieves 91.7%, 87.4%, and 79.1% average accuracy, respectively. For object detection, Lynx attains up to 69.5 average IoU and 95.1% classification accuracy on VOC2012. Our results demonstrate that Lynx provides competitive performance, robust forgetting mitigation, and scalable adaptation, offering a spectral alternative to weight-space masking and low-rank adapters.
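The spectral modulation described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function names are hypothetical, and the sketch only shows the core idea: factor a frozen weight once via SVD, then store only a per-singular-value scaling vector as the adapter and recompose the effective weight on demand.

```python
import numpy as np

def make_lynx_adapter(W):
    """Factor a frozen weight matrix once; U, s, Vt stay fixed.

    Returns a recomposition closure and the rank (the adapter's size
    in floats, which is what keeps adapters kilobyte-scale).
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)

    def recompose(scale):
        # Learned multiplicative modulation of the singular values:
        # W_eff = U @ diag(scale * s) @ Vt, computed via broadcasting.
        return (U * (scale * s)) @ Vt

    return recompose, s.shape[0]

# Toy frozen backbone weight.
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 6))

recompose, rank = make_lynx_adapter(W)

# An all-ones scaling vector reproduces the original frozen weights;
# a trained vector would be learned per mask (task or class group).
W_identity = recompose(np.ones(rank))
```

Because only the length-`rank` scaling vector is stored per mask, swapping tasks amounts to swapping a tiny vector and recomposing, while U and Vᵀ are shared across all masks.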
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 20165