From Extrapolation to Generalization: How Conditioning Transforms Symmetry Learning in Diffusion Models

Published: 23 Sept 2025, Last Modified: 29 Oct 2025
NeurReps 2025 Poster
License: CC BY 4.0
Keywords: diffusion models, symmetry learning, generalization, interpolation, extrapolation, equivariance, group theory, representation learning
TL;DR: Conditioning diffusion models on group elements solves symmetry extrapolation by factorizing the learning problem into a simpler generalization task.
Abstract: When trained on data with missing symmetries, diffusion models face a fundamental challenge: how can they generate samples respecting symmetries they have never observed? We prove that this failure stems from the structure of the learning problem itself. Unconditional models must satisfy a global equivariance constraint, coupling all group elements into a single optimization that requires high-dimensional data extrapolation across gaps. In contrast, conditioning on group elements factorizes this into $|\mathcal{G}|$ independent problems, transforming the task into low-dimensional function generalization. Our theory predicts, and experiments confirm, that this simple change yields a 5-10× error reduction on held-out symmetries. On synthetic 2D rotation tasks, conditional models maintain low error even with 300° gaps, while unconditional models collapse catastrophically. We further suggest that topology-aware group embeddings may improve this generalization by ensuring smoother functions over the group manifold.
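To make the factorization concrete, below is a minimal sketch (assuming PyTorch; not the paper's implementation) of a denoiser conditioned on an SO(2) group element. The angle θ is embedded as (cos θ, sin θ), a topology-aware embedding that is continuous on the circle rather than discontinuous at the 0/2π wrap-around. All class and parameter names are illustrative.

```python
import torch
import torch.nn as nn

class ConditionalScoreNet(nn.Module):
    """Illustrative denoiser conditioned on a group element g in SO(2).

    The rotation angle theta enters as (cos theta, sin theta), so the
    conditioning signal respects the circle's topology.
    """

    def __init__(self, data_dim: int = 2, hidden: int = 128):
        super().__init__()
        # Inputs: noisy sample x_t, diffusion time t, and the 2-D angle embedding.
        self.net = nn.Sequential(
            nn.Linear(data_dim + 1 + 2, hidden),
            nn.SiLU(),
            nn.Linear(hidden, hidden),
            nn.SiLU(),
            nn.Linear(hidden, data_dim),
        )

    def forward(self, x_t: torch.Tensor, t: torch.Tensor, theta: torch.Tensor) -> torch.Tensor:
        # Topology-aware embedding of the group element: a point on S^1.
        g_emb = torch.stack([torch.cos(theta), torch.sin(theta)], dim=-1)
        return self.net(torch.cat([x_t, t.unsqueeze(-1), g_emb], dim=-1))
```

Because the group element is an explicit input, each θ defines its own low-dimensional regression target; an unconditional model would instead have to satisfy the equivariance constraint jointly across all group elements, which is what forces it to extrapolate across gaps in the training data.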
Submission Number: 110