Decomposition of Equivariant Maps via Invariant Maps: Application to Universal Approximation under Symmetry

TMLR Paper 2369 Authors

12 Mar 2024 (modified: 20 Mar 2024) · Under review for TMLR · CC BY-SA 4.0
Abstract: In this paper, we develop a theory of the relationship between invariant and equivariant maps with respect to a group $G$. We then leverage this theory in the context of deep neural networks with group symmetries in order to obtain novel insight into their mechanisms. More precisely, we establish a one-to-one correspondence between equivariant maps and certain invariant maps. This allows us to reduce arguments about equivariant maps to arguments about invariant maps, and vice versa. As an application, we propose a construction of universal equivariant architectures built from universal invariant networks. In turn, we explain how the universal architectures arising from our construction differ from standard equivariant architectures known to be universal. Furthermore, we explore the complexity of our models in terms of the number of free parameters, and discuss the relation between the complexities of invariant and equivariant networks. Finally, we give an approximation rate for $G$-equivariant deep neural networks with ReLU activation functions for a finite group $G$.
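To make the invariant/equivariant connection concrete, the sketch below illustrates a classical instance of the idea for a finite group: averaging an arbitrary map over the group action produces an equivariant map. This is a standard symmetrization construction used for illustration only; it is not claimed to be the paper's exact decomposition. The group, representation, and the map `f` are all assumptions chosen for the example.

```python
import numpy as np

# Illustrative sketch (assumed example, not the paper's construction):
# for a finite group G acting on R^n, group-averaging an arbitrary map f,
#   F(x) = (1/|G|) * sum_{g in G} g . f(g^{-1} . x),
# yields a G-equivariant map F. Here G is the cyclic group C_n acting on
# R^n by coordinate shifts (np.roll).

rng = np.random.default_rng(0)
n = 5
W = rng.standard_normal((n, n))

def f(x):
    # an arbitrary nonlinear map, generically NOT equivariant
    return np.tanh(W @ x)

def symmetrize(f, n):
    # average f over all n cyclic shifts; the result is shift-equivariant
    def F(x):
        return sum(np.roll(f(np.roll(x, -k)), k) for k in range(n)) / n
    return F

F = symmetrize(f, n)
x = rng.standard_normal(n)

# equivariance check for the generator g = shift-by-1: F(g.x) == g.F(x)
lhs = F(np.roll(x, 1))
rhs = np.roll(F(x), 1)
print(np.allclose(lhs, rhs))  # True
```

The same averaging trick underlies many equivariant architectures for finite groups; the paper's contribution, per the abstract, is a finer one-to-one correspondence that goes beyond plain symmetrization.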
Submission Length: Long submission (more than 12 pages of main content)
Assigned Action Editor: ~Alberto_Bietti1
Submission Number: 2369