Slow and Weak Attractor Computation Embedded in Fast and Strong E-I Balanced Neural Dynamics

Published: 21 Sept 2023, Last Modified: 21 Dec 2023 · NeurIPS 2023 spotlight
Keywords: Continuous attractor neural network; Excitation-inhibition balance; Brain-inspired algorithms; Object tracking
TL;DR: Continuous attractor network dynamics and excitation-inhibition balanced dynamics can coexist in one circuit with synergistic computation benefits.
Abstract: Attractor networks require neuronal connections to be highly structured in order to maintain attractor states that represent information, whereas excitation-inhibition balanced networks (E-INNs) require neuronal connections to be random and sparse to generate irregular neuronal firing. Despite being regarded as canonical models of neural circuits, the two types of networks are usually studied in isolation, and it remains unclear how they coexist in the brain given their very different structural demands. In this study, we investigate the compatibility of continuous attractor neural networks (CANNs) and E-INNs. In line with recent experimental data, we find that a neural circuit can exhibit the traits of both CANNs and E-INNs if its synapses consist of two sets: one set that is strong and fast, supporting irregular firing, and another that is weak and slow, supporting attractor dynamics. Simulations and theoretical analysis reveal that such a network also outperforms one using only a single set of synapses, showing accelerated convergence to attractor states and a preserved E-I balance condition under localized input. We further apply the network model to a real-world tracking problem and demonstrate that it can track fast-moving objects well. We hope this study provides insight into how structured neural computations are realized through the irregular firing of neurons.
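To make the two-synapse idea concrete, here is a minimal rate-model sketch of a ring network whose recurrent connectivity is split into a weak, slow, structured (Gaussian, CANN-like) set and a strong, fast, sparse random zero-mean (E-INN-like) set. This is an illustrative toy under stated assumptions, not the paper's actual model or equations: the ring topology, Gaussian kernel, connection probability, time constants, parameter values, and saturating rate function are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256                                              # neurons on a ring (assumed size)
x = np.linspace(-np.pi, np.pi, N, endpoint=False)    # preferred stimulus positions

# Weak, slow, structured synapses: translation-invariant Gaussian kernel (CANN-like).
a = 0.5                                              # tuning width (assumed)
d = np.abs(x[:, None] - x[None, :])
d = np.minimum(d, 2.0 * np.pi - d)                   # distance on the ring
W_slow = 0.02 * np.exp(-d**2 / (2.0 * a**2))

# Strong, fast, sparse random synapses with zero-mean (balanced) weights (E-INN-like).
p = 0.1                                              # connection probability (assumed)
mask = rng.random((N, N)) < p
W_fast = np.where(mask, rng.normal(0.0, 1.0 / np.sqrt(p * N), (N, N)), 0.0)

# Time constants: the fast synapses relax much quicker than the slow ones (assumed).
dt, tau_m, tau_fast, tau_slow = 0.1, 1.0, 0.5, 10.0

u = np.zeros(N)          # membrane potential
I_fast = np.zeros(N)     # current through the fast random synapses
I_slow = np.zeros(N)     # current through the slow structured synapses

# Localized external input centered at stim_pos.
stim_pos = 0.0
d_stim = np.abs(x - stim_pos)
d_stim = np.minimum(d_stim, 2.0 * np.pi - d_stim)
I_ext = np.exp(-d_stim**2 / (2.0 * a**2))

for _ in range(2000):                                # forward-Euler integration
    r = np.maximum(np.tanh(u), 0.0)                  # saturating, non-negative rate
    I_fast += dt / tau_fast * (-I_fast + W_fast @ r)
    I_slow += dt / tau_slow * (-I_slow + W_slow @ r)
    u += dt / tau_m * (-u + I_fast + I_slow + I_ext)

r = np.maximum(np.tanh(u), 0.0)
print("activity-bump peak at x =", x[np.argmax(r)])  # expected near stim_pos
```

In this sketch the slow Gaussian kernel supports a localized activity bump that follows the external input, while the fast zero-mean random couplings add the strong, rapidly fluctuating recurrent drive characteristic of E-I balanced dynamics. The paper's quantitative claims (accelerated convergence, preserved E-I balance, object tracking) rest on its own model and analysis in the main text and supplement, not on this toy example.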
Supplementary Material: pdf
Submission Number: 3193