Communication subspaces align with training in ANNs

Published: 23 Oct 2024 · Last Modified: 24 Feb 2025 · NeurReps 2024 Oral · CC BY 4.0
Keywords: communication subspaces, convolutional neural networks, subspace alignment
TL;DR: We found that training aligned communication subspaces within convolutional neural networks, across layers and across varied connection types.
Abstract: Communication subspaces have recently been identified as a promising mechanism for selectively routing information between brain areas. In this study, we investigated whether communication subspaces develop with training in artificial neural networks (ANNs) and how they differ across connection types. Specifically, we analyzed the subspace angles between activations and weights in ResNet-50 before and after training. We found that activations were more aligned to the weight layers after training, although this effect decreased in deeper layers. We also analyzed the angles between pairs of weight layers. For branching, direct, and skip connections alike, weight layer pairs were more geometrically aligned in trained than in untrained models throughout the entire network. These findings indicate that such alignment is essential for the proper functioning of deep networks and highlight the potential to enhance training efficiency through pre-alignment. Our results motivate further exploration of whether learning induces similar subspace alignment in biological data.
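For readers who want to probe this kind of alignment themselves, the sketch below shows one way to compute principal angles between an activation subspace and a weight subspace for a single ResNet-50 layer. This is a minimal illustration, not the authors' released code: the choice of layer, the random probe batch, and the subspace dimension `k` are all assumptions made for the example.

```python
# Minimal sketch: principal angles between the activation subspace entering a
# ResNet-50 layer and that layer's weight subspace. Layer choice, probe batch,
# and k are illustrative assumptions, not the paper's exact protocol.
import numpy as np
import torch
import torchvision.models as models
from scipy.linalg import subspace_angles

model = models.resnet50(weights=None)  # untrained; use weights="DEFAULT" for trained
model.eval()

# Capture the input activations of a chosen 1x1 conv via a forward hook.
acts = []
layer = model.layer1[0].conv1  # example layer; any conv/linear layer works
layer.register_forward_hook(lambda m, inp, out: acts.append(inp[0].detach()))

with torch.no_grad():
    model(torch.randn(16, 3, 224, 224))  # random probe batch (assumption)

# Flatten activations to (samples, in_channels) and take their top-k PCA basis.
X = acts[0].permute(0, 2, 3, 1).reshape(-1, acts[0].shape[1]).numpy()
X -= X.mean(axis=0)
k = 10
Ux, _, _ = np.linalg.svd(X.T @ X)
act_basis = Ux[:, :k]                      # (in_channels, k)

# Weight subspace: top-k left singular vectors of W^T, i.e. the directions of
# input space that the (out_channels, in_channels) 1x1 conv weight reads from.
W = layer.weight.detach().numpy().reshape(layer.out_channels, -1)
Uw, _, _ = np.linalg.svd(W.T, full_matrices=False)
w_basis = Uw[:, :k]                        # (in_channels, k)

# Principal angles in radians; smaller angles indicate stronger alignment.
angles = subspace_angles(act_basis, w_basis)
print(np.degrees(angles))
```

The same `subspace_angles` call applies unchanged to a pair of weight bases from two different layers, which is the natural way to reproduce, in spirit, the weight-layer-pair comparison described in the abstract.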
Submission Number: 30