Abstract: Continual learning (CL) requires models to learn tasks sequentially, yet deep neural networks often suffer from plasticity loss and poor knowledge transfer, which can impede their long-term adaptability.
Drawing high-level inspiration from global neuromodulatory mechanisms in the brain, we introduce $\textbf{Neu}$ro$\textbf{Mo}$dulation and $\textbf{Sync}$hronization ($\texttt{NeuMoSync}$), a novel architecture that integrates dynamic, neuron-specific modulation into deep neural networks to enhance their adaptability and plasticity.
$\texttt{NeuMoSync}$ extends standard neural network architectures with a learnable feature vector per neuron that tracks network-wide historical context, and incorporates a module operating at a higher level of abstraction.
This module synthesizes neuron-specific signals, conditioned on both current inputs and the network's evolving state, to adaptively regulate activation dynamics and synaptic plasticity. Evaluated on diverse CL benchmarks spanning memorization (Random Label CIFAR-10, Random Label MNIST), concept drift (Shuffle CIFAR-10), class-incremental learning (Class Split T-ImageNet, CIFAR-100), and domain-incremental learning (Permuted MNIST), $\texttt{NeuMoSync}$ retains plasticity over long task sequences and improves both forward and backward adaptation compared to existing methods.
Ablation studies validate the necessity of each component, while the analysis of the learned modulatory signals reveals interpretable coordination patterns across tasks. Our work underscores the potential of integrating global coordination mechanisms into deep learning systems to advance robust, adaptive continual learning.
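To make the abstract's mechanism concrete, the following is a minimal sketch of neuron-specific modulation in numpy. It is not the paper's implementation: all sizes, the context vectors `ctx`, the modulation matrix `M`, and the gain/shift parameterization are illustrative assumptions; only the high-level idea (a module mapping current input plus per-neuron context to signals that rescale each neuron's activation) comes from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 8 inputs, 16 hidden neurons, 4-dim per-neuron context.
D_IN, D_HID, D_CTX = 8, 16, 4

# Base layer weights (a standard dense layer).
W = rng.normal(0, 0.1, (D_HID, D_IN))
b = np.zeros(D_HID)

# Learnable per-neuron context vectors; in the paper these track
# network-wide historical state, here they are random placeholders.
ctx = rng.normal(0, 0.1, (D_HID, D_CTX))

# Tiny stand-in for the higher-level module: maps
# [current input ; neuron context] -> (raw gain, raw shift) per neuron.
M = rng.normal(0, 0.1, (2, D_IN + D_CTX))

def modulated_forward(x):
    """Forward pass with a neuron-specific multiplicative gain and additive shift."""
    pre = W @ x + b                                   # standard pre-activation, (D_HID,)
    # Broadcast the input to every neuron and pair it with that neuron's context.
    feats = np.concatenate([np.tile(x, (D_HID, 1)), ctx], axis=1)  # (D_HID, D_IN+D_CTX)
    raw = feats @ M.T                                 # (D_HID, 2)
    gain = 1.0 + np.tanh(raw[:, 0])                   # gain centered at 1 (unmodulated)
    shift = raw[:, 1]
    return gain * np.maximum(pre, 0.0) + shift        # modulated ReLU activation

x = rng.normal(size=D_IN)
h = modulated_forward(x)
print(h.shape)  # (16,)
```

In a full system the same signals could also scale per-neuron learning rates (the "synaptic plasticity" side of the abstract); this sketch only shows the activation-dynamics side.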
Submission Type: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Weiyang_Liu1
Submission Number: 7753