Track: long paper (up to 8 pages)
Keywords: Symmetry breaking, Equivariant networks
Abstract: Equivariant neural networks improve generalization by incorporating symmetry, but real-world data often breaks symmetry in complex ways. In particular, different orbits in the data may exhibit different symmetry patterns or different degrees of symmetry breaking. Existing relaxed equivariant models share weights across all orbits, which is less effective in such heterogeneous settings.
We propose Data-Adaptive Relaxed Equivariant Networks (DAREN), a method that learns to adjust symmetry behavior separately for each orbit. By generating orbit-specific relaxed weights and using a gating mechanism to control symmetry strength, our model adapts to varying symmetry levels across the data. Experiments on synthetic datasets show that DAREN outperforms fully equivariant, relaxed equivariant, and unconstrained models, especially when symmetry varies across regions.
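The gating idea from the abstract can be illustrated with a minimal sketch. Everything below is an assumption for illustration, not the paper's implementation: the group (C4, i.e. 90-degree rotations of the plane), the layer form (a single linear map), and all names (`equivariant_projection`, `daren_layer`, `gate`) are hypothetical. The sketch interpolates, per orbit, between a weight matrix projected onto the C4-equivariant subspace (gate = 1) and an unconstrained weight matrix (gate = 0).

```python
import numpy as np

# Hypothetical sketch of a gated relaxed-equivariant linear layer.
# Names and layer form are illustrative assumptions, not DAREN's code.

def rot(k):
    """Rotation matrix for the C4 element (k * 90 degrees)."""
    c, s = np.cos(k * np.pi / 2), np.sin(k * np.pi / 2)
    return np.array([[c, -s], [s, c]])

def equivariant_projection(W):
    """Project W onto the subspace of maps commuting with C4:
    W_eq = (1/4) * sum_k R_k^T W R_k (group averaging)."""
    return sum(rot(k).T @ W @ rot(k) for k in range(4)) / 4

def daren_layer(x, W, gate):
    """Apply a gated mix of equivariant and unconstrained weights.
    gate=1 -> fully equivariant; gate=0 -> unconstrained.
    In the adaptive setting, gate would be predicted per orbit."""
    W_mix = gate * equivariant_projection(W) + (1 - gate) * W
    return W_mix @ x

rng = np.random.default_rng(0)
W = rng.normal(size=(2, 2))
x = rng.normal(size=2)
R = rot(1)  # a 90-degree rotation

# With gate=1 the layer commutes with the rotation (equivariance holds):
err_eq = np.abs(daren_layer(R @ x, W, 1.0) - R @ daren_layer(x, W, 1.0)).max()
# With gate=0 a generic W does not commute, so equivariance is broken:
err_free = np.abs(daren_layer(R @ x, W, 0.0) - R @ daren_layer(x, W, 0.0)).max()
```

In this toy version, the group average makes `W_eq` commute with every C4 rotation, so the gate directly tunes how strictly the symmetry constraint is enforced; an intermediate gate value yields a partially symmetric map.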
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Submission Number: 31