Keywords: Synaptic states, Recurrent neural networks, Hebbian plasticity, Backpropagation, Antisymmetric weights, Experimental prediction
TL;DR: Biologically inspired recurrent neural networks whose synapses switch between Hebbian (weak) and backpropagation (strong) learning reproduce observed bimodal weight distributions and outperform standard backpropagation-only networks.
Abstract: We explore networks whose synapses can independently change their governing plasticity rule during training. The MICrONS connectome data revealed that cortical synapses are well described by two states: strong synapses that have developed a spine apparatus and weak synapses that lack it. The spine apparatus, a calcium reservoir affecting synaptic dynamics, plays a significant role in plasticity and learning, though its exact function is not fully understood. Although the connectome data are static, synapses can dynamically gain or lose the spine apparatus. Here, under simplifying assumptions, we model a network whose synapses can switch between two learning rules: a pre-post, Hebbian-like rule governing weak synapses and a credit-assignment rule, backpropagation (BP), governing strong synapses. We explore such a system using recurrent neural networks (RNNs) and contrast our plasticity-switching RNNs with vanilla, BP-only RNNs. Surprisingly, we found that switching RNNs learn faster, i.e., with fewer examples, than vanilla RNNs. We also found that the recurrent weight matrix of the trained plasticity-switching RNNs is significantly more antisymmetric than that of the vanilla RNNs. This surprising prediction, given that Hebbian updates are nearly symmetric, deserves further investigation to reconcile it with connectomic graph analysis.
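To make the described mechanism concrete, below is a minimal NumPy sketch (not the authors' implementation) of how a per-synapse state mask could select between a Hebbian-like pre-post update for weak synapses and a gradient-based (BP) update for strong synapses, together with one way to quantify the antisymmetry of the recurrent weight matrix. All names, learning rates, the random stand-in for the BP gradient, and the antisymmetry index are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 100                                      # hypothetical number of recurrent units
W = rng.normal(0, 1 / np.sqrt(n), (n, n))    # recurrent weight matrix
state = rng.random((n, n)) < 0.5             # True = "strong" synapse (has spine apparatus)

eta_hebb, eta_bp = 1e-3, 1e-2                # illustrative learning rates

def mixed_update(W, state, pre, post, bp_grad):
    """Apply a Hebbian-like update to weak synapses and a gradient (BP) step
    to strong synapses; the boolean state mask acts element-wise on W."""
    hebb = np.outer(post, pre)               # pre-post (Hebbian-like) term
    W = W + eta_hebb * (~state) * hebb       # weak synapses: Hebbian-like rule
    W = W - eta_bp * state * bp_grad         # strong synapses: gradient descent step
    return W

def antisymmetry_index(W):
    """Fraction of the Frobenius norm carried by the antisymmetric part,
    ||(W - W.T)/2||_F / ||W||_F: 0 for a symmetric W, 1 for a fully antisymmetric W."""
    anti = 0.5 * (W - W.T)
    return np.linalg.norm(anti) / np.linalg.norm(W)

# Toy usage with random activity and a random placeholder for the BP gradient.
pre = rng.standard_normal(n)
post = rng.standard_normal(n)
bp_grad = rng.standard_normal((n, n))
W = mixed_update(W, state, pre, post, bp_grad)
print(f"antisymmetry index: {antisymmetry_index(W):.3f}")
```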
Submission Number: 107