Keywords: transformation, symmetry, diffusion, sampling, equivariance, test-time, pde, homography, deep learning
TL;DR: We propose a method to invert transformations of data by sampling from a Boltzmann distribution on a Lie group.
Abstract: We study the problem of transformation inversion on general Lie groups: a datum is transformed by an unknown group element, and the goal is to recover an inverse transformation that maps it back to the original data distribution. Such unknown transformations arise widely in machine learning and scientific modeling, where they can significantly distort observations. As a key application, we focus on test-time equivariance, where the objective is to improve the robustness of pretrained neural networks to input transformations at inference time and without any (re)training. We take a probabilistic view and model the posterior over transformations as a Boltzmann distribution defined by an energy function in data space. To sample from this posterior, we introduce a diffusion process on Lie groups that keeps all updates on-manifold and only requires computations in the associated Lie algebra. Our method, Transformation-Inverting Energy Diffusion (TIED), relies on a new trivialized target-score identity that enables efficient score-based sampling of the transformation posterior. TIED naturally handles curved group geometries and rugged, multimodal energy landscapes, and it applies to a broad class of Lie groups and nonlinear actions without assuming compactness or a bi-invariant metric. Experiments on image homographies and PDE symmetries demonstrate that TIED can restore transformed inputs to the training distribution at test time, showing improved performance over strong canonicalization and sampling baselines.
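To make the core idea concrete, here is a minimal sketch of score-based sampling of a transformation posterior on a Lie group, in the spirit the abstract describes. This is not the paper's TIED algorithm: it is a plain trivialized Langevin sampler on SO(2) with a toy energy, where the gradient is taken in the Lie algebra (via finite differences here, for simplicity) and every update is a multiplicative group step, so all iterates stay exactly on the manifold. All function names, the energy, and the temperature scheme are illustrative assumptions.

```python
import numpy as np

def expm_so2(theta):
    # Exponential map so(2) -> SO(2); the algebra is 1-dimensional,
    # so an element is just a scalar angle.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def energy(R, x, mu):
    # Toy data-space energy: squared distance of the transformed
    # point R @ x to a reference mu (stand-in for "distance to the
    # training distribution").
    return 0.5 * np.sum((R @ x - mu) ** 2)

def trivialized_grad(R, x, mu, eps=1e-4):
    # Derivative of t -> E(R exp(t * e)) at t = 0, i.e. the gradient
    # pulled back ("trivialized") to the Lie algebra, estimated by
    # central finite differences.
    Ep = energy(R @ expm_so2(eps), x, mu)
    Em = energy(R @ expm_so2(-eps), x, mu)
    return (Ep - Em) / (2 * eps)

def langevin_on_so2(x, mu, steps=2000, eta=1e-2, temp=0.01, seed=0):
    # Langevin dynamics targeting the Boltzmann distribution
    # exp(-E / temp) over SO(2). Each step is a right-multiplication
    # by a group exponential, so R never leaves the manifold.
    rng = np.random.default_rng(seed)
    R = expm_so2(rng.uniform(0.0, 2.0 * np.pi))  # random init on the group
    for _ in range(steps):
        g = trivialized_grad(R, x, mu)
        xi = rng.standard_normal()
        R = R @ expm_so2(-eta * g + np.sqrt(2.0 * eta * temp) * xi)
    return R
```

At low temperature the sampler concentrates near the rotation that maps `x` back onto `mu`; for this toy energy with `x = [1, 0]` and `mu = [0, 1]` that is a 90-degree rotation. The same structure (algebra-valued gradient, multiplicative exponential update) carries over to higher-dimensional groups, where the algebra element becomes a vector of generator coefficients.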
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 22862