Input correlations impede suppression of chaos and learning in balanced firing-rate networks

Published: 01 Jan 2022. Last Modified: 14 May 2023. PLoS Comput. Biol. 2022.
Abstract (Author summary): Information in the brain is processed by a deeply layered structure of local recurrent neural circuits. Recurrent neural networks often exhibit spontaneous irregular activity patterns that arise generically from the disordered interactions between neurons. Understanding under which conditions one circuit can control the activity patterns in another circuit and suppress spontaneous chaotic fluctuations is crucial for unraveling information flow and for learning input-output tasks. Here we find that when different neurons receive identical input, a larger input modulation amplitude is necessary to suppress chaos and facilitate learning in balanced firing-rate networks than when different neurons receive distinct inputs. This counterintuitive behavior is explained by a dynamic cancellation of common external input by recurrent currents, a feature previously described in balanced networks of binary neurons. We systematically study how this effect scales with different network parameters, describe the high- and low-frequency limits analytically, and develop a novel non-stationary dynamic mean-field theory that predicts when chaos is suppressed by correlated time-dependent input. Finally, we investigate the implications for learning in balanced firing-rate networks.
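To make the setup concrete, the sketch below (Python/NumPy) illustrates the kind of protocol the summary describes: a random firing-rate network driven by a sinusoidal input that is either identical across neurons or carries an independent random phase per neuron, with a simple measure of how much chaotic (non-input-locked) fluctuation survives. All parameter values (`N`, `g`, `I0`, `freq`) and the Gaussian-coupling network as a stand-in for the paper's balanced excitatory-inhibitory architecture are illustrative assumptions, not the authors' exact model; the code shows the comparison being made, not a reproduction of the reported quantitative effect.

```python
# Minimal sketch (NOT the authors' exact balanced model): a random
# firing-rate network driven by sinusoidal input whose phase is either
# common to all neurons or independently drawn per neuron. We then
# measure how much non-input-locked ("chaotic") variance remains.
import numpy as np

rng = np.random.default_rng(0)

N = 500      # number of neurons (illustrative)
g = 1.5      # coupling gain; g > 1 yields chaos in the autonomous network
dt = 0.05    # Euler integration step
T = 2000     # number of time steps
I0 = 1.0     # input modulation amplitude (hypothetical value)
freq = 0.5   # input frequency in radians per unit time (hypothetical)

# Random Gaussian coupling, variance g^2 / N (Sompolinsky-style stand-in
# for the balanced excitatory-inhibitory connectivity studied in the paper)
J = g * rng.standard_normal((N, N)) / np.sqrt(N)

def simulate(phases):
    """Euler-integrate dx/dt = -x + J tanh(x) + I0 cos(freq*t + phase_i)."""
    x = rng.standard_normal(N)
    traj = np.empty((T, N))
    for step in range(T):
        inp = I0 * np.cos(freq * step * dt + phases)
        x += dt * (-x + J @ np.tanh(x) + inp)
        traj[step] = x
    return traj

common = simulate(np.zeros(N))                        # identical input to all neurons
distinct = simulate(rng.uniform(0, 2 * np.pi, N))     # independent phase per neuron

def residual_var(traj):
    """Variance left after projecting out the component locked to the drive."""
    late = traj[T // 2:]                              # discard the transient
    t = np.arange(T // 2, T) * dt
    basis = np.column_stack([np.cos(freq * t), np.sin(freq * t), np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(basis, late, rcond=None)
    return np.mean(np.var(late - basis @ coef, axis=0))

print("residual variance, common input:  ", residual_var(common))
print("residual variance, distinct input:", residual_var(distinct))
```

Under the paper's finding, suppressing chaos with the common (identical) input requires a larger modulation amplitude I0 than with per-neuron distinct input, because in balanced networks the recurrent currents dynamically cancel the common component of the external drive; in this simplified Gaussian stand-in, the measure above merely shows where that comparison would be read off.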