Towards Architectural Optimization of Equivariant Neural Networks over Subgroups

Published: 07 Nov 2022, Last Modified: 05 May 2023 (NeurReps 2022 Poster)
Keywords: equivariance, neural architecture search, geometric deep learning
TL;DR: Two theoretical contributions towards equivariance-aware architectural optimization of neural networks based on relaxation to subgroups
Abstract: Incorporating equivariance to symmetry groups in artificial neural networks (ANNs) can improve performance on tasks exhibiting those symmetries, but such symmetries are often only approximate and not explicitly known. This motivates algorithmically optimizing the architectural constraints imposed by equivariance. We propose the equivariance relaxation morphism, which preserves functionality while reparameterizing a group equivariant layer to operate with equivariance constraints on a subgroup, and the $[G]$-mixed equivariant layer, which mixes operations constrained to equivariance to different groups to enable within-layer equivariance optimization. These two architectural tools can be used within neural architecture search (NAS) algorithms for equivariance-aware architectural optimization.
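To make the second tool concrete, below is a minimal PyTorch sketch of a "[G]-mixed" layer in the spirit of the abstract: each candidate operation is constrained to equivariance to a different subgroup, and the layer outputs a learned softmax-weighted mixture of their outputs so that architectural weights can be optimized alongside network parameters (e.g. by a differentiable NAS algorithm). This is an illustrative assumption, not the authors' implementation; the class name, the candidate-group labels, and the placeholder convolutions standing in for genuinely group-equivariant operations are all hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixedEquivariantLayer(nn.Module):
    """Sketch of a layer that mixes operations constrained to different subgroups.

    Each entry in `candidate_ops` is assumed to be equivariant to one candidate
    subgroup; the layer learns one architectural logit per candidate and returns
    the softmax-weighted sum of their outputs.
    """

    def __init__(self, candidate_ops: dict):
        super().__init__()
        self.ops = nn.ModuleDict(candidate_ops)
        # One architectural weight (logit) per candidate subgroup.
        self.alpha = nn.Parameter(torch.zeros(len(candidate_ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        # Convex combination of all group-constrained candidate outputs.
        return sum(w * op(x) for w, op in zip(weights, self.ops.values()))


# Illustrative usage: ordinary convolutions stand in for C4-, C2-, and
# C1-equivariant layers (placeholders only, not real group convolutions).
layer = MixedEquivariantLayer({
    "C4": nn.Conv2d(8, 16, kernel_size=3, padding=1),
    "C2": nn.Conv2d(8, 16, kernel_size=3, padding=1),
    "C1": nn.Conv2d(8, 16, kernel_size=3, padding=1),
})
y = layer(torch.randn(2, 8, 32, 32))  # -> shape (2, 16, 32, 32)
```

The mixing weights can then be inspected or annealed toward a single subgroup, which is what makes within-layer equivariance a searchable architectural choice rather than a fixed design decision.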