Keywords: Hebbian, Neuroscience, Gradient Descent, Machine Learning
TL;DR: Hebbian dynamics can emerge from gradient descent and other learning algorithms.
Abstract: Stochastic gradient descent (SGD) is often viewed as biologically implausible, while local Hebbian rules dominate theories of synaptic plasticity in the brain. We derive and empirically demonstrate that SGD with weight decay can naturally produce Hebbian-like dynamics near stationarity, whereas injected gradient noise can flip the alignment to be anti-Hebbian. The effect holds for nearly any learning rule, even some random ones, revealing Hebbian-like behavior as an epiphenomenon of deeper optimization dynamics during training. These results narrow the gap between artificial and biological learning and caution against treating observed Hebbian signatures as evidence against global error-driven mechanisms in the brain. For machine learning, our results shed light on how regularization and noise give rise to feature-learning behavior in trained models.
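To make the central measurement concrete, here is a minimal, hypothetical sketch (not the paper's actual experiments): a single linear layer is trained with SGD plus weight decay on a toy regression task, and each weight update is compared against the classic Hebbian outer product (post-synaptic activity times pre-synaptic activity) via cosine similarity. The architecture, hyperparameters, and alignment metric are illustrative assumptions only.

```python
import torch

torch.manual_seed(0)

# Toy data: random inputs, targets from a fixed "teacher" linear map.
n, d_in, d_out = 512, 20, 5
X = torch.randn(n, d_in)
teacher = torch.randn(d_in, d_out)
Y = X @ teacher + 0.1 * torch.randn(n, d_out)

# Single linear layer ("student"); its weight matrix plays the role of the synapses.
layer = torch.nn.Linear(d_in, d_out, bias=False)
opt = torch.optim.SGD(layer.parameters(), lr=1e-2, weight_decay=1e-3)

def hebbian_term(x, y):
    # Hebbian outer product: post-activity (d_out) times pre-activity (d_in),
    # averaged over the batch.
    return y.T @ x / x.shape[0]

for step in range(2001):
    opt.zero_grad()
    pred = layer(X)
    loss = ((pred - Y) ** 2).mean()
    loss.backward()

    w_before = layer.weight.detach().clone()
    opt.step()
    delta_w = layer.weight.detach() - w_before   # actual SGD + weight-decay update
    hebb = hebbian_term(X, pred.detach())        # Hebbian direction for comparison

    if step % 500 == 0:
        cos = torch.nn.functional.cosine_similarity(
            delta_w.flatten(), hebb.flatten(), dim=0)
        print(f"step {step:4d}  loss {loss.item():.4f}  "
              f"update-vs-Hebbian cosine {cos.item():+.3f}")
```

A positive cosine late in training would indicate Hebbian-like alignment of the updates; the same probe could be rerun with explicit gradient noise to look for the anti-Hebbian flip described in the abstract.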
Student Paper: Yes
Submission Number: 17