Fine-tuning hierarchical circuits through learned stochastic co-modulation

Published: 21 Oct 2022, Last Modified: 05 May 2023 | Attention Workshop, NeurIPS 2022 Oral | Readers: Everyone
Keywords: gain modulation, neural covariability, hierarchical coding, biological vision
TL;DR: Targeted stochastic co-modulation in the brain provides a label for task-relevant information that can help fine-tune a hierarchical model of the visual system for a new task
Abstract: Attentional gating is a core mechanism supporting behavioral flexibility, but its biological implementation remains uncertain. Gain modulation of neural responses is likely to play a key role, but simply boosting relevant neural responses can be insufficient for improving behavioral outputs, especially in hierarchical circuits. Here we propose a variation of attentional gating that relies on stochastic gain modulation as a dedicated indicator of task relevance, which guides task-specific readout adaptation. We show that targeted stochastic modulation can be effectively learned and used to fine-tune hierarchical architectures, without reorganization of the underlying circuits. Simulations of such networks demonstrate improvements in learning efficiency and performance in novel tasks, relative to traditional attentional mechanisms based on deterministic gain increases. The effectiveness of this approach relies on the availability of representational bottlenecks in which the task-relevant information is localized in small subpopulations of neurons. Overall, this work provides a new mechanism for constructing intelligent systems that can flexibly and robustly adapt to changes in task structure.
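
The sketch below illustrates the core idea described in the abstract, not the authors' implementation: a shared stochastic gain signal multiplicatively co-modulates a small, hypothetical task-relevant subpopulation of units, and a downstream readout can locate that subpopulation purely from the covariability this induces. All sizes, noise levels, and the recovery method (top eigenvector of the response covariance) are illustrative assumptions.

```python
# Minimal sketch of stochastic co-modulation as a task-relevance label.
import numpy as np

rng = np.random.default_rng(0)

n_units, n_trials = 50, 5000
relevant = np.arange(5)  # hypothetical task-relevant subpopulation

# Baseline (task-independent) responses of a hidden layer on each trial.
base = rng.normal(1.0, 0.2, size=(n_trials, n_units))

# Shared stochastic gain fluctuation, applied only to the relevant units.
gain = 1.0 + 0.5 * rng.standard_normal(n_trials)
responses = base.copy()
responses[:, relevant] *= gain[:, None]

# A downstream readout can recover the labeled subpopulation from the
# low-rank structure of the response covariance (top eigenvector here).
cov = np.cov(responses, rowvar=False)
top_eigvec = np.linalg.eigh(cov)[1][:, -1]
recovered = np.argsort(np.abs(top_eigvec))[-len(relevant):]

print(sorted(recovered))  # matches the co-modulated units {0, ..., 4}
```

In a hierarchical network, the same principle would let a new readout be fine-tuned by weighting the units flagged by the shared fluctuation, without rewiring the underlying circuit.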