Quantifying information stored in synaptic connections rather than in firing activities of neural networks
Keywords: Synaptic information storage, Coding by weights, Synaptic coding, Mutual information, Hopfield network, Distributed coding, Synergistic interactions
TL;DR: A theoretical framework for measuring how much information can be stored in arbitrary sets of synaptic weights in Hopfield networks.
Abstract: A cornerstone of our understanding of both biological and artificial neural networks is that they store information in the strengths of synaptic connections among the constituent neurons. However, in contrast to the well-established theory for quantifying information encoded by the firing activities of neural networks, there does not exist a framework for quantifying information stored by a network's synaptic connections. Here, we develop a theoretical framework using continuous Hopfield networks as an exemplar of associative neural networks and modeling real-world data patterns as sets of independent multivariate log-normal distributions. Specifically, we derive, analytically, the Shannon mutual information between the data and singletons, pairs and arbitrary $n$-tuples of synaptic connections within the network. Our framework corroborates well-established insights regarding pattern storage capacity and the principle of distributed coding in neural firing activities. Notably, it discovers synergistic interactions among synapses, revealing that the information encoded jointly by all the synapses exceeds the 'sum of its parts'. Taken together, this study introduces a powerful, interpretable framework for quantitatively understanding information storage in the synapses of neural networks, one that illustrates the duality of synaptic connectivity and neural population activity in learning and memory.
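The core quantity of the abstract can be illustrated numerically. The sketch below is not the authors' analytical derivation: it is a minimal Monte Carlo illustration, assuming a standard Hebbian outer-product learning rule for a single synapse and a plug-in (histogram) estimator of mutual information, with log-normal pattern components as in the paper's data model. All parameter values (`P`, `sigma`, bin count) are hypothetical choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_weight(patterns):
    # Hebbian outer-product rule for one synapse between neurons i=0, j=1:
    # w = (1/P) * sum_mu x_i^mu * x_j^mu   (an assumed, standard learning rule)
    return patterns[:, 0] @ patterns[:, 1] / len(patterns)

def binned_mi(x, y, bins=16):
    # Plug-in (histogram) estimate of Shannon mutual information, in bits.
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

trials, P, sigma = 20000, 5, 0.4
# Each trial draws P independent patterns whose components are log-normal,
# mirroring the paper's data model for a single synapse (two components).
data = rng.lognormal(mean=0.0, sigma=sigma, size=(trials, P, 2))
w = np.array([hebbian_weight(d) for d in data])
s = data[:, 0, 0] * data[:, 0, 1]  # contribution of pattern 1 to the weight

mi = binned_mi(s, w)
print(f"I(pattern-1 contribution; synaptic weight) ~= {mi:.2f} bits")
```

Increasing `P` dilutes each pattern's contribution to the weight and drives this single-synapse information toward zero, which is the storage-capacity intuition the abstract refers to; extending the estimator to pairs of weights is what reveals the synergistic ("more than the sum of its parts") effects the paper derives analytically.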

Submission Number: 13