Allostatic Control of Persistent States in Spiking Neural Networks for Perception and Computation

27 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: Allostatic, Dynamic, Attractors
Abstract: We introduce a novel model for updating perceptual beliefs about the environment by extending the concept of allostasis to the control of internal representations. Allostasis is a fundamental regulatory mechanism observed in animal physiology that orchestrates responses to maintain a dynamic equilibrium in bodily needs and internal states. In this paper, we focus on an application in numerical cognition, where a bump of activity in an attractor network serves as a spatial-numerical representation. While existing neural networks can maintain persistent states, there is to date no unified framework for dynamically controlling spatial changes in neuronal activity in response to environmental changes. To address this, we couple a well-known allostatic microcircuit, the Hammel model, with a ring attractor, resulting in a Spiking Neural Network architecture that can modulate the location of the bump as a function of a reference input. This localised activity is in turn used as a perceptual belief in a simulated subitization task – a quick enumeration process without counting. We provide a general procedure to fine-tune the model and demonstrate successful control of the bump location. We also study the model's response time with respect to changes in parameters and compare it with biological data. Finally, we analyze the dynamics of the network to understand the selectivity and specificity of different neurons to the different categories present in the input. The results of this paper, particularly the mechanism for moving persistent states, are not limited to numerical cognition but can be applied to a wide range of tasks involving similar representations.
Supplementary Material: zip
Primary Area: applications to neuroscience & cognitive science
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 11763