On Fairly Comparing Group Equivariant Networks

Published: 17 Jun 2024, Last Modified: 13 Jul 2024
Venue: ICML 2024 Workshop GRaM
License: CC BY 4.0
Track: Proceedings
Keywords: group, symmetry, equivariance, invariance, polytopal complex, ReLU, splines, expressivity, flexibility
TL;DR: What are the effects of incorporating equivariance on ReLU networks' polytopal complexes?
Abstract: This paper investigates the flexibility of Group Equivariant Convolutional Neural Networks (G-CNNs), which specialize conventional neural networks by encoding equivariance to group transformations. Inspired by splines, we propose new metrics to assess the complexity of ReLU networks and use them to quantify and compare the flexibility of networks equivariant to different groups. Our analysis suggests that the current practice of comparing networks at a fixed number of trainable parameters unfairly grants models equivariant to larger groups additional expressivity. Instead, we advocate comparisons at a fixed computational budget, which we empirically show yields more similar levels of network flexibility. This approach disentangles the effect of constraining a network to be equivariant from the extra expressivity such networks are typically granted in the literature, giving a more nuanced view of the impact of enforcing equivariance. Interestingly, our experiments indicate that enforcing equivariance results in *more* complex fitted functions even when controlling for compute, despite reducing network expressivity.
Submission Number: 61
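To make the abstract's parameters-versus-compute distinction concrete, here is a minimal, hypothetical sketch (not the authors' code): a C4-equivariant lifting convolution built by weight sharing, in the style of standard G-CNNs. The class name `C4LiftingConv`, the channel sizes, and the `conv_macs` helper are illustrative assumptions. It shows that, at an equal parameter count, the equivariant layer performs roughly |C4| = 4 times the multiply-accumulates of a plain convolution, which is why fixing parameters rather than compute favors models equivariant to larger groups.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class C4LiftingConv(nn.Module):
    """Lifting convolution equivariant to 90-degree rotations (C4):
    one learned filter bank is shared across its 4 rotated copies."""
    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_ch, in_ch, k, k) * 0.1)

    def forward(self, x):
        # Same parameters, 4x the filters -> roughly 4x the compute.
        filters = torch.cat(
            [torch.rot90(self.weight, r, dims=(-2, -1)) for r in range(4)],
            dim=0,
        )
        return F.conv2d(x, filters, padding=1)

def param_count(module):
    return sum(p.numel() for p in module.parameters())

def conv_macs(out_ch, in_ch, k, h, w):
    # Multiply-accumulates of a stride-1, 'same'-padded dense convolution.
    return out_ch * in_ch * k * k * h * w

plain = nn.Conv2d(16, 64, kernel_size=3, padding=1, bias=False)
equiv = C4LiftingConv(16, 64, k=3)

h = w = 32
print("params plain:", param_count(plain), "| C4:", param_count(equiv))  # 9216 vs 9216
print("MACs   plain:", conv_macs(64, 16, 3, h, w),
      "| C4:", conv_macs(4 * 64, 16, 3, h, w))                           # 4x more
```

Under this sketch, matching compute instead of parameters would mean shrinking the equivariant layer (e.g., reducing its output channels by 4x here) before comparing flexibility, which is the kind of fixed-budget comparison the abstract advocates.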