Latent Mixture of Symmetries for Sample-Efficient Dynamic Learning

Published: 18 Sept 2025 · Last Modified: 29 Oct 2025 · NeurIPS 2025 poster · CC BY 4.0
Keywords: dynamic learning; sample efficiency; symmetry preservation; Lie transformations; geometric deep learning
TL;DR: This paper proposes a novel geometric deep learning model to preserve the underlying mixture of symmetry transformations for sample-efficient dynamic learning
Abstract: Learning dynamics is essential for model-based control and reinforcement learning in systems operating in changing environments, such as robotics, autonomous vehicles, and power systems. However, limited system measurements, such as those from low-resolution meters, demand sample-efficient learning. Symmetry provides a powerful inductive bias by characterizing equivalence relations in system behavior to improve sample efficiency. While recent methods attempt to discover symmetries from data, they typically assume a single global symmetry group and treat symmetry discovery and dynamic learning as separate tasks, leading to limited expressiveness and error accumulation. In this paper, we propose the Latent Mixture of Symmetries (Latent MoS), an expressive model that captures symmetry-governed latent factors from complex dynamical measurements. Latent MoS focuses on dynamic learning while locally preserving the underlying symmetry transformations. To further capture long-range temporal equivalence, we introduce a hierarchical architecture that stacks Latent MoS blocks across multiple time scales. Numerical experiments across diverse physical systems demonstrate that Latent MoS significantly outperforms state-of-the-art baselines in interpolation and extrapolation tasks while offering interpretable latent representations suitable for future geometric and safety-critical analysis.
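To make the "mixture of symmetries" idea concrete, the following is a minimal, hypothetical sketch (not the authors' code): latent dynamics are advanced by a group element obtained from a weighted mixture of Lie-algebra generators. Here each generator is an angular velocity of the planar rotation group SO(2), so the resulting transformation preserves the latent norm by construction; the class name, `step` interface, and the choice of SO(2) are illustrative assumptions.

```python
import math

class MixtureOfSymmetries2D:
    """Illustrative sketch of a mixture of symmetries in a 2-D latent space.

    Each generator is a Lie-algebra element of SO(2) (an angular velocity);
    a mixture weight vector blends them into a single generator, and the
    matrix exponential of that generator acts on the latent state. This is
    a toy stand-in for the paper's idea, not its actual architecture.
    """

    def __init__(self, rates):
        # rates: candidate angular velocities (Lie-algebra generators of SO(2))
        self.rates = list(rates)

    def step(self, z, weights, dt=1.0):
        # Blend generators: theta = dt * sum_k w_k * omega_k
        theta = dt * sum(w * r for w, r in zip(weights, self.rates))
        # exp(theta * J) for J = [[0,-1],[1,0]] is a plain 2-D rotation,
        # so the latent norm is preserved exactly (the symmetry constraint).
        c, s = math.cos(theta), math.sin(theta)
        x, y = z
        return (c * x - s * y, s * x + c * y)
```

A usage sketch: with mixture weights `(0.5, 0.3, 0.2)` over three candidate rates, one `step` rotates the latent state while leaving its norm unchanged, which is the sense in which the transformation "preserves the underlying symmetry."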
Supplementary Material: zip
Primary Area: Machine learning for sciences (e.g. climate, health, life sciences, physics, social sciences)
Submission Number: 20646