Finding Symmetry in Neural Network Parameter Spaces

Published: 10 Oct 2024, Last Modified: 05 Nov 2024
Venue: UniReps
License: CC BY 4.0
Track: Extended Abstract Track
Keywords: Symmetry
Abstract: Parameter space symmetries, or loss-invariant transformations, are important for understanding neural networks' loss landscapes, training dynamics, and generalization. However, identifying the full set of these symmetries remains a challenge. In this paper, we formalize data-dependent parameter symmetries and derive their infinitesimal form, which enables an automated approach to discovering symmetries across different architectures. Our framework systematically uncovers parameter symmetries, including previously unknown ones. We also prove that symmetries in smaller subnetworks can extend to larger networks, allowing symmetries discovered in small architectures to generalize to more complex models.
Submission Number: 60
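
To make the notion of an infinitesimal parameter symmetry concrete, below is a minimal numerical sketch, not the paper's discovery framework. It uses a toy two-layer linear network, where the well-known rescaling symmetry W1 → g W1, W2 → W2 g⁻¹ has infinitesimal generator V(W1, W2) = (A W1, −W2 A) for an arbitrary matrix A; loss invariance is equivalent to the directional derivative ⟨∇L, V⟩ vanishing at every parameter point. The loss, shapes, and names are illustrative assumptions.

```python
import numpy as np

# Toy setting (assumed for illustration): two-layer linear network with loss
#   L(W1, W2; X, Y) = 0.5 * || W2 @ W1 @ X - Y ||_F^2
# The rescaling W1 -> g W1, W2 -> W2 g^{-1} leaves the loss invariant.
# Its infinitesimal generator is V(W1, W2) = (A @ W1, -W2 @ A) for any A,
# and invariance means <grad L, V> = 0 at every parameter point.

rng = np.random.default_rng(0)
d_in, h, d_out, n = 3, 4, 2, 5

W1 = rng.normal(size=(h, d_in))
W2 = rng.normal(size=(d_out, h))
X = rng.normal(size=(d_in, n))
Y = rng.normal(size=(d_out, n))

def grads(W1, W2, X, Y):
    """Gradients of L = 0.5 * ||W2 W1 X - Y||_F^2 with respect to W1, W2."""
    R = W2 @ W1 @ X - Y                 # residual
    gW1 = W2.T @ R @ X.T                # dL/dW1
    gW2 = R @ (W1 @ X).T                # dL/dW2
    return gW1, gW2

gW1, gW2 = grads(W1, W2, X, Y)

# Pick an arbitrary generator A and form the candidate symmetry direction V.
A = rng.normal(size=(h, h))
V1, V2 = A @ W1, -W2 @ A

# Directional derivative of the loss along V; vanishes up to floating point.
directional = np.sum(gW1 * V1) + np.sum(gW2 * V2)
print(f"<grad L, V> = {directional:.2e}")
```

An automated search in this spirit would treat the generator as an unknown and solve for directions making ⟨∇L, V⟩ vanish across sampled parameters and data, which is roughly the role the infinitesimal form plays in enabling systematic discovery.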