Keywords: Continual Learning, Adapters, Dynamic Expansion
TL;DR: This paper proposes TIDE, a continual learning method that dynamically expands model capacity only when needed to reduce forgetting and memory use.
Abstract: Catastrophic forgetting remains a core challenge in continual learning, where parameter updates for new tasks interfere destructively with knowledge acquired from previous tasks. Dynamic expansion methods mitigate forgetting by inserting task-specific adapters, but rigid growth schedules often expand capacity unnecessarily on stable tasks while failing to protect against interference that arises later within a task. We propose Timed Dynamic Expansion (TIDE), a method that stabilises expansion by creating adapters only at the moments they are needed, preventing both wasted growth and destructive forgetting. This strategy improves training stability by limiting redundant modules, reduces memory overhead by avoiding expansion on tasks that do not conflict with prior knowledge, and ensures protection when forgetting arises unpredictably. At inference, TIDE combines adapter outputs through a Fisher Information–weighted gating mechanism to route information through the adapters most critical for retention. Experiments on standard CL benchmarks demonstrate that TIDE reduces forgetting, improves long-term retention, and achieves these gains with lower parameter growth than existing expansion methods. Code is available at https://anonymous.4open.science/r/TIDE-23B3/.
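The abstract's inference-time mechanism, combining adapter outputs through a Fisher Information–weighted gate, can be illustrated with a minimal sketch. The paper does not specify the exact gating form here, so the softmax-over-importances combination below, and the names `fisher_weighted_gate`, `adapter_outputs`, and `fisher_scores`, are illustrative assumptions, not TIDE's actual implementation.

```python
import torch

def fisher_weighted_gate(adapter_outputs, fisher_scores):
    """Illustrative sketch (not the paper's code): combine per-adapter
    outputs with weights proportional to a per-adapter Fisher Information
    importance score, normalised via softmax.

    adapter_outputs: list of tensors, each of shape (batch, dim)
    fisher_scores:   1-D tensor with one scalar importance per adapter
    """
    weights = torch.softmax(fisher_scores, dim=0)          # normalise importances to a simplex
    stacked = torch.stack(adapter_outputs, dim=0)          # (n_adapters, batch, dim)
    return (weights[:, None, None] * stacked).sum(dim=0)   # importance-weighted mixture
```

Under this assumed form, adapters whose parameters carry high Fisher Information for retained tasks dominate the mixture, routing inference through the modules most critical for retention.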
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 24237