SNAP: Stopping Catastrophic Forgetting in Hebbian Learning with Sigmoidal Neuronal Adaptive Plasticity
Abstract: Artificial Neural Networks (ANNs) suffer from catastrophic forgetting, where learning new tasks degrades performance on previous ones. Existing algorithms typically use linear weight updates, in which the magnitude of an update is independent of the current weight strength. This contrasts with biological neurons, whose connections are highly plastic at intermediate strengths but consolidate through Long-Term Potentiation (LTP) once they reach a certain strength. We hypothesize that this biological mechanism could mitigate catastrophic forgetting in ANNs. We introduce Sigmoidal Neuronal Adaptive Plasticity (SNAP), an artificial approximation of Long-Term Potentiation for ANNs in which weights follow sigmoidal growth, consolidating and stabilizing once they reach sufficiently extreme values. We compare SNAP to linear and exponential weight growth and find that SNAP prevents forgetting of previous tasks for Hebbian Learning but not for Stochastic Gradient Descent (SGD) based learning.
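The abstract does not spell out the exact update rule, but a minimal sketch of the idea, assuming a logistic (sigmoid-shaped) plasticity gate applied to a plain Hebbian update, might look like the following. The function name `snap_hebbian_step`, the bound `w_max`, and the gate `|W| * (w_max - |W|)` are illustrative assumptions, not the paper's definition.

```python
import numpy as np

def snap_hebbian_step(W, pre, post, lr=0.1, w_max=1.0, eps=1e-3):
    """One Hebbian step with a logistic (sigmoid-shaped) plasticity gate.

    Hypothetical sketch of SNAP-style consolidation: the Hebbian term is
    scaled by |W| * (w_max - |W|), so weights of intermediate magnitude are
    the most plastic, while weights approaching +/- w_max barely move
    (consolidation). Under a constant drive the weight magnitude then traces
    a sigmoidal (logistic) curve over time. `eps` keeps near-zero weights
    from being frozen entirely.
    """
    hebbian = np.outer(post, pre)                   # standard Hebbian term: post * pre
    mag = np.abs(W)
    gate = np.clip(mag * (w_max - mag), eps, None)  # logistic plasticity gate
    return np.clip(W + lr * gate * hebbian, -w_max, w_max)

# Toy usage: repeated co-activation drives a single weight along a sigmoid.
W = np.full((1, 1), 0.01)                           # small nonzero initialization
pre, post = np.array([1.0]), np.array([1.0])
trajectory = []
for _ in range(500):
    W = snap_hebbian_step(W, pre, post)
    trajectory.append(W.item())
print(trajectory[::20][:8])  # slow start, rapid intermediate growth, plateau near w_max
```

In this sketch, a linear-growth baseline would correspond to `gate = 1`; the sigmoidal gate is what lets large weights stabilize and resist being overwritten by later tasks.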
Submission Type: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Gunhee_Kim1
Submission Number: 6715