Track: Extended Abstract (non-archival, 4 pages)
Keywords: symmetry, equivariance, averaging
TL;DR: We prove that approximate symmetry is exponentially easier to enforce than exact symmetry via averaging
Abstract: Enforcing exact symmetry in machine learning models often yields significant gains in scientific applications, serving as a powerful inductive bias. However, recent work suggests that relying on approximate symmetry can offer greater flexibility and robustness. Despite promising empirical evidence, there has been little theoretical understanding, and in particular, a direct comparison between exact and approximate symmetry is missing from the literature. In this paper, we initiate this study by asking:
What is the cost of enforcing exact versus approximate symmetry?
To address this question, we introduce averaging complexity, a framework for quantifying the cost of enforcing symmetry via averaging. Our main result is an exponential separation: under standard conditions, achieving exact symmetry requires linear averaging complexity, whereas approximate symmetry can be attained with only logarithmic averaging complexity.
To the best of our knowledge, this provides the first theoretical separation of these two cases, formally justifying why approximate symmetry may be preferable in practice. Beyond this, our tools and techniques may be of independent interest for the broader study of symmetries in machine learning.
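The contrast in the abstract can be illustrated with a toy sketch (the setup and names here are our own illustration, not the paper's construction): exact symmetrization averages a function over every element of a finite group, costing |G| evaluations, while an approximate variant averages over only k sampled elements.

```python
import itertools
import random
import numpy as np

def exact_symmetrize(f, x, group):
    # Exact group averaging: one evaluation of f per group element, cost |G|.
    return sum(f(g(x)) for g in group) / len(group)

def approx_symmetrize(f, x, group, k, seed=0):
    # Approximate averaging: Monte Carlo estimate from k sampled elements.
    rng = random.Random(seed)
    sample = [rng.choice(group) for _ in range(k)]
    return sum(f(g(x)) for g in sample) / k

# Toy example: the symmetric group S_4 acting on R^4 by permuting coordinates.
n = 4
perms = [np.array(p) for p in itertools.permutations(range(n))]
group = [lambda x, p=p: x[p] for p in perms]  # |G| = 24 elements

f = lambda x: float(x[0])          # not invariant on its own
x = np.arange(n, dtype=float)      # [0., 1., 2., 3.]

exact = exact_symmetrize(f, x, group)        # = mean(x) = 1.5, exactly invariant
approx = approx_symmetrize(f, x, group, k=8) # invariant only in expectation
```

Here the exact average needs all 24 permutations, while the sampled version uses 8; the paper's result formalizes how few samples suffice for approximate symmetry (logarithmic, versus linear for exact) in its averaging-complexity framework.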
Supplementary Material: zip
Submission Number: 6