Generalized Reduction to the Isotropy for Flexible Equivariant Neural Fields

Published: 02 Mar 2026, Last Modified: 08 Mar 2026 · ICLR 2026 Workshop GRaM Poster · CC BY 4.0
Track: tiny paper (up to 4 pages)
Keywords: Equivariant neural networks, invariant theory, homogeneous spaces, group actions, geometric deep learning, neural fields, isotropy subgroups, orbit spaces, symmetry
TL;DR: We prove that invariant functions on heterogeneous product spaces can be simplified to invariant functions on a smaller space using a stabilizer subgroup, enabling more flexible equivariant neural field architectures.
Abstract: Many geometric learning problems require invariants on heterogeneous product spaces, i.e., products of distinct spaces carrying different group actions, where standard techniques do not directly apply. We show that, when a group $G$ acts transitively on a space $M$, any $G$-invariant function on a product space $X \times M$ can be reduced to an invariant of the isotropy subgroup $H$ of a base point in $M$, acting on $X$ alone. Our approach establishes an explicit orbit equivalence $(X \times M)/G \cong X/H$, yielding a principled reduction that preserves expressivity. We apply this characterization to Equivariant Neural Fields, extending them to arbitrary group actions and homogeneous conditioning spaces, and thereby removing the major structural constraints imposed by existing methods.
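The reduction in the abstract can be illustrated concretely. The following is a minimal numerical sketch, not the paper's construction: it assumes $G = \mathrm{SE}(2)$ acting transitively on $M = \mathbb{R}^2$, with isotropy subgroup $H = \mathrm{SO}(2)$ at the origin, and takes $X = \mathbb{R}^2$ with the same action. A $G$-invariant function $f$ on $X \times M$ (here, Euclidean distance) reduces to the $H$-invariant function $g(x) = f(x, 0)$, and $f(x, m) = g(\tau x)$ for any group element $\tau$ sending $m$ to the base point.

```python
import numpy as np

def se2_apply(theta, t, p):
    """Apply the SE(2) element (rotation by theta, then translation by t) to point p."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return R @ p + t

def f(x, m):
    """An SE(2)-invariant function on X x M: distance is preserved by rigid motions."""
    return np.linalg.norm(x - m)

def g(x):
    """The reduced SO(2)-invariant function on X alone: g(x) = f(x, base point)."""
    return f(x, np.zeros(2))

def reduce_pair(x, m):
    """Carry x along any group element sending m to the base point (a translation suffices)."""
    return x - m

x, m = np.array([3.0, 1.0]), np.array([1.0, -2.0])

# f on the product equals g on the reduced point: f(x, m) = g(x - m).
assert np.isclose(f(x, m), g(reduce_pair(x, m)))

# G-invariance of f: applying one SE(2) element to both arguments changes nothing.
theta, t = 0.7, np.array([2.0, -1.0])
assert np.isclose(f(se2_apply(theta, t, x), se2_apply(theta, t, m)), f(x, m))

# H-invariance of g: rotations about the origin (the isotropy of the base point).
assert np.isclose(g(se2_apply(theta, np.zeros(2), x)), g(x))
```

In this toy setting the orbit equivalence $(X \times M)/G \cong X/H$ is visible directly: every pair $(x, m)$ can be translated to a pair $(x - m, 0)$, after which only rotations about the origin remain as symmetries.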
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Submission Number: 36