TL;DR: Biologically inspired neural networks with synapses switching between Hebbian (weak) and backpropagation (strong) learning reproduce observed bimodal weight distributions and learn faster, from fewer examples, than standard backpropagation networks.
Abstract: We explore networks whose synapses can independently change their governing plasticity rule during training. The MICrONS connectome data revealed that cortical synapses are well described by two states: strong synapses that have developed the spine apparatus and weak synapses that lack it. The spine apparatus, a calcium reservoir affecting synaptic dynamics, plays a significant role in plasticity and learning, though its exact function is not fully understood. Although the connectome data is static, synapses can dynamically gain or lose the spine apparatus. Here, under simplifying assumptions, we model a network whose synapses can switch between two learning rules: a Hebbian-like pre-post rule governing weak synapses, and a credit-assignment rule (backpropagation, BP) governing strong synapses. We explore such a system using recurrent neural networks (RNNs) and contrast our plasticity-switching RNNs with vanilla, BP-only RNNs. Surprisingly, we found that the switching RNNs learn faster, i.e., from fewer examples, than vanilla RNNs. We also found that the recurrent weight matrix of the trained plasticity-switching RNNs is significantly more antisymmetric than that of the vanilla RNNs. This prediction is surprising, given that Hebbian updates are nearly symmetric, and deserves further investigation to reconcile it with connectomic graph analyses.
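As an illustration of the mechanism described above, the following is a minimal sketch (in PyTorch) of one training step for a plasticity-switching RNN. It assumes a hand-rolled vanilla RNN cell and a fixed binary mask assigning each recurrent synapse to the strong (BP) or weak (Hebbian) rule; the names (strong_mask, eta_bp, eta_hebb), the random rule assignment, and the learning rates are illustrative assumptions, not the paper's actual switching mechanism or hyperparameters.

import torch

torch.manual_seed(0)
n_in, n_hid, n_out, T = 4, 32, 2, 10

W_in = torch.randn(n_hid, n_in) * 0.1               # fixed input weights
W_rec = (torch.randn(n_hid, n_hid) * 0.1).requires_grad_()
W_out = (torch.randn(n_out, n_hid) * 0.1).requires_grad_()

# Each recurrent synapse independently follows one of the two rules
# (here assigned at random; the paper's switching criterion is not shown).
strong_mask = (torch.rand(n_hid, n_hid) < 0.5).float()  # 1 = BP, 0 = Hebbian
eta_bp, eta_hebb = 1e-2, 1e-3

x = torch.randn(T, n_in)        # one input sequence
target = torch.randn(n_out)     # dummy regression target

h = torch.zeros(n_hid)
hebb = torch.zeros(n_hid, n_hid)  # accumulated pre-post coactivity
for t in range(T):
    h_prev = h
    h = torch.tanh(W_in @ x[t] + W_rec @ h_prev)
    # Hebbian trace: post (rows) times pre (columns), summed over time.
    hebb = hebb + torch.outer(h.detach(), h_prev.detach())

loss = ((W_out @ h - target) ** 2).mean()
loss.backward()  # BPTT supplies credit-assignment gradients for strong synapses

with torch.no_grad():
    # Strong synapses take a gradient step; weak synapses take a Hebbian step.
    W_rec -= eta_bp * strong_mask * W_rec.grad
    W_rec += eta_hebb * (1 - strong_mask) * hebb
    W_out -= eta_bp * W_out.grad
    W_rec.grad.zero_()
    W_out.grad.zero_()

Under this sketch, the antisymmetry reported in the abstract could be probed after training by comparing the norms of (W_rec - W_rec.T) / 2 and (W_rec + W_rec.T) / 2.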
Length: short paper (up to 4 pages)
Domain: methods
Author List Check: The author list is correctly ordered and I understand that additions and removals will not be allowed after the abstract submission deadline.
Anonymization Check: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and URLs that point to identifying information.
Submission Number: 12