Fast Learning in Balanced Deep Spiking Neural Networks with Strong and Weak Synapses

15 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: excitation-inhibition balance, spiking neural networks, brain-inspired, neuroscience
TL;DR: We propose a neuroscience-inspired model consisting of a set of fixed strong connections for E-I balance maintenance and a set of weak connections for trainable neural computation, and we demonstrate its computational advantages.
Abstract: The intricate neural dynamics of the cerebral cortex are often characterized in terms of the delicate balance between excitation and inhibition (E-I balance). While numerous studies have delved into its functional implications, one fundamental issue has remained unresolved -- namely, _the unstructured, random connections posed by E-I balance dynamics versus the necessity for structured neural connections to fulfill specific computational tasks_. This raises the crucial question: How can neural circuits reconcile these seemingly contradictory demands? Drawing inspiration from recent data in neuroscience, we propose a biologically grounded spiking neural network. This network incorporates two distinct sets of synaptic connections: one featuring strong synapses dedicated to maintaining the balance condition, and the other comprising weak synapses utilized for neural computation. Crucially, only the weak synapses undergo training, while the strong synapses remain fixed. Interestingly, we have discovered that this architecture not only resolves the structural conflict but also offers several compelling computational advantages. Firstly, the E-I balance dynamics mediated by strong synapses can closely mimic the function of normalization operations, effectively alleviating the internal covariate shift problem. Secondly, we have observed that weak synapses remain weak during training without any imposed constraints, thus preserving the balance condition established by the strong synapses. Lastly, the coexistence of strong and weak synapses allows for a seamless transition from the "lazy" learning regime, characterized by the primary training of readout weights, to the "rich" learning regime, marked by alterations in neural representations. We believe this study can shed light on how structured computations can coexist with unstructured E-I balance dynamics and offer novel perspectives on the computational advantages of E-I balance.
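The two-synapse architecture described in the abstract can be illustrated with a minimal sketch. The code below is an assumption-laden toy (a rate approximation rather than a spiking model; the 1/sqrt(N) strong-weight scaling, the tanh nonlinearity, and the delta-rule update are illustrative choices, not the paper's actual construction): a fixed strong recurrent matrix maintains the dynamics while only a weak input matrix is updated during training.

```python
import numpy as np

rng = np.random.default_rng(0)
N, D = 100, 20  # hidden units, input dimension

# Strong recurrent synapses: fixed, with 1/sqrt(N) scaling
# (a common balanced-network convention; hypothetical here).
J_strong = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
np.fill_diagonal(J_strong, 0.0)

# Weak feedforward synapses: trainable, initialized at O(1/N).
W_weak = rng.normal(0.0, 1.0 / N, size=(N, D))

def step(r, x, dt=0.1):
    """One Euler step of rate dynamics driven by strong recurrent
    plus weak feedforward input."""
    drive = J_strong @ r + W_weak @ x
    return r + dt * (-r + np.tanh(drive))

def train_step(x, target, lr=0.01, T=50):
    """Run the dynamics, then apply a delta-rule update that
    touches ONLY the weak synapses; J_strong stays fixed."""
    global W_weak
    r = np.zeros(N)
    for _ in range(T):
        r = step(r, x)
    err = target - r
    W_weak += lr * np.outer(err, x)  # strong weights untouched
    return float(np.mean(err**2))
```

The key structural point is that `J_strong` never appears on the left-hand side of any update, mirroring the paper's separation between fixed balance-maintaining connections and trainable computational ones.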
Primary Area: applications to neuroscience & cognitive science
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 155