Input Alignment along Chaotic Directions Increases Stability in Recurrent Neural Networks

25 Sept 2019 (modified: 05 May 2023) | ICLR 2020 Conference Withdrawn Submission | Readers: Everyone
Abstract: Anatomical studies demonstrate that the brain reformats input information to generate reliable responses for performing computations. However, it remains unclear how neural circuits encode complex spatio-temporal patterns. We show that neural dynamics are strongly influenced by the phase alignment between the input and the spontaneous chaotic activity. Aligning the input along the dominant chaotic projections causes the chaotic trajectories to become stable channels (or attractors), thereby improving the computational capability of a recurrent network. Using mean field analysis, we derive the impact of input alignment on the overall stability of the attractors formed. Our results indicate that input alignment determines the extent of intrinsic noise suppression and hence alters the stability of the attractor states, thereby controlling the network's inference ability.
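The sketch below is a minimal illustration of the idea in the abstract, not the authors' code: a random chaotic rate network is driven by the same sinusoid either along the leading principal component of its spontaneous activity (used here as a stand-in for the "dominant chaotic projections") or along a random direction, and trial-to-trial variability of the driven trajectories is used as a rough proxy for stability. Network size, gain, integration step, and input amplitude are assumed values.

```python
# Minimal sketch: does input aligned with the leading direction of spontaneous
# chaotic activity suppress trial-to-trial variability more than a random input
# direction? All parameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N, g, dt, T = 200, 1.5, 0.1, 2000                    # neurons, chaotic gain (g > 1), step, steps
J = g * rng.standard_normal((N, N)) / np.sqrt(N)     # random recurrent weights

def run(x0, u=None):
    """Euler-integrate dx/dt = -x + J*tanh(x) + u(t); return firing-rate trajectory."""
    x, rates = x0.copy(), np.empty((T, N))
    for t in range(T):
        inp = u[t] if u is not None else 0.0
        x += dt * (-x + J @ np.tanh(x) + inp)
        rates[t] = np.tanh(x)
    return rates

# 1) Spontaneous chaotic activity -> dominant direction via PCA (SVD of centered rates).
spont = run(rng.standard_normal(N))
spont -= spont.mean(axis=0)
_, _, Vt = np.linalg.svd(spont, full_matrices=False)
aligned_dir = Vt[0]                                  # leading chaotic projection (unit norm)
random_dir = rng.standard_normal(N)
random_dir /= np.linalg.norm(random_dir)

# 2) Drive the network with the same sinusoid along each direction, starting from
#    several perturbed initial states, and compare the spread across trials.
signal = 2.0 * np.sin(2 * np.pi * np.arange(T) * dt / 20.0)

def trial_spread(direction, n_trials=5):
    trials = [run(rng.standard_normal(N), signal[:, None] * direction)
              for _ in range(n_trials)]
    trials = np.stack(trials)[:, T // 2:]            # discard transient
    return trials.std(axis=0).mean()                 # inter-trial variability

print("variability, aligned input :", trial_spread(aligned_dir))
print("variability, random input  :", trial_spread(random_dir))
```

Lower inter-trial variability for the aligned input would be consistent with the abstract's claim that alignment turns chaotic trajectories into stable channels; the paper's mean field analysis, not reproduced here, quantifies this effect.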
Keywords: Reservoir Models, Stability, Chaos, Input Alignment, Mean Field Analysis
TL;DR: Structuring inputs along chaotic directions increases stability in recurrent networks.