Sparse Activations with Correlated Weights in Cortex-Inspired Neural Networks

Published: 20 Nov 2023 · Last Modified: 06 Dec 2023 · CPAL 2024 (Proceedings Track) Oral
Keywords: Correlated weights, Biological neural network, Cortex, Neural network gaussian process, Sparse neural network, Bayesian neural network, Generalization theory, Kernel ridge regression, Deep neural network, Random neural network
Abstract: Although sparse activations are commonly observed in cortical brain circuits, their computational benefits for machine learning are not well understood. Recent neural network Gaussian process models have incorporated sparsity into infinitely wide neural network architectures, but in these models the Gram matrices approach the identity matrix as sparsity increases. This collapse of input-pattern similarities in the network representation is caused by the use of independent weight vectors. In this work, we show how weak correlations in the weights can counter this effect. Correlations among the synaptic weights are introduced through a convolutional model, analogous to the structure of lateral connections in the cortex. We show how to theoretically compute the properties of infinitely wide networks with sparse, correlated weights and rectified linear outputs. In particular, we demonstrate that introducing these correlations improves the generalization performance of the sparse networks, and we show how to compute the degree of correlation that yields the best-performing deep networks.
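The collapse-and-rescue effect described in the abstract can be illustrated numerically. Below is a minimal NumPy sketch, not the paper's exact construction: the toy inputs, layer width, smoothing kernel, and sparsity levels are all illustrative assumptions. It controls activation sparsity with a thresholded ReLU and correlates the weights by circularly convolving i.i.d. Gaussian rows with a Gaussian kernel, loosely mimicking the convolutional structure the abstract describes.

```python
# A minimal NumPy sketch, not the paper's exact model: all inputs, widths, and
# kernel parameters below are illustrative assumptions. It shows (i) that with
# independent weights, sparser thresholded-ReLU activations drive the
# normalized Gram matrix of two similar inputs toward the identity, and
# (ii) that convolutionally correlated weights counteract this collapse.
import numpy as np

rng = np.random.default_rng(0)
d, width = 256, 20_000                     # input dimension, hidden-layer width

# Two similar inputs: a smooth signal and the same signal plus fine detail.
t = np.arange(d)
x1 = np.cos(2 * np.pi * 3 * t / d)
x2 = x1 + 0.8 * np.cos(2 * np.pi * 60 * t / d)
X = np.stack([x1, x2])
X /= np.linalg.norm(X, axis=1, keepdims=True)

def normalized_gram(X, W, active_frac):
    """Cosine-normalized Gram matrix of features relu(W x - theta), with the
    threshold theta chosen so roughly `active_frac` of the units are active."""
    Z = X @ W.T
    theta = np.quantile(Z, 1.0 - active_frac)   # smaller fraction -> sparser
    H = np.maximum(Z - theta, 0.0)
    G = H @ H.T
    norms = np.sqrt(np.diag(G))
    return G / np.outer(norms, norms)

# Independent weights: each hidden unit draws its own i.i.d. Gaussian vector.
W_iid = rng.standard_normal((width, d))

# Correlated weights: circularly convolve each row with a Gaussian kernel.
# Equivalently, the network sees low-pass-filtered inputs, so the inputs'
# high-frequency difference is suppressed before the sparsifying nonlinearity.
k = np.exp(-0.5 * (np.minimum(t, d - t) / 4.0) ** 2)
W_corr = np.fft.ifft(np.fft.fft(W_iid, axis=1) * np.fft.fft(k)[None, :], axis=1).real

for frac in (0.5, 0.1, 0.01):
    s_iid = normalized_gram(X, W_iid, frac)[0, 1]
    s_corr = normalized_gram(X, W_corr, frac)[0, 1]
    print(f"active fraction {frac:>4}:  iid similarity {s_iid:.3f},  "
          f"correlated similarity {s_corr:.3f}")
```

In this toy setting, the off-diagonal similarity under i.i.d. weights shrinks toward zero as the active fraction decreases, so the normalized Gram matrix approaches the identity, while the correlated weights keep it high because the smoothing suppresses the high-frequency difference between the two inputs before the sparsifying nonlinearity is applied.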
Track Confirmation: Yes, I am submitting to the proceedings track.
Submission Number: 22