Controlled Generation with Equivariant Variational Flow Matching

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY-NC-SA 4.0
TL;DR: We propose Variational Flow Matching for controlled generation, unifying conditional training and post hoc Bayesian inference with an equivariant formulation for molecular generation.
Abstract: We derive a controlled generation objective within the framework of Variational Flow Matching (VFM), which casts flow matching as a variational inference problem. We demonstrate that controlled generation can be implemented in two ways: (1) by end-to-end training of conditional generative models, or (2) as a Bayesian inference problem, enabling post hoc control of unconditional models without retraining. Furthermore, we establish the conditions required for equivariant generation and provide an equivariant formulation of VFM tailored to molecular generation, ensuring invariance to rotations, translations, and permutations. We evaluate our approach on both uncontrolled and controlled molecular generation, achieving state-of-the-art performance on uncontrolled generation and outperforming state-of-the-art models on controlled generation, both with end-to-end training and in the Bayesian inference setting. This work strengthens the connection between flow-based generative modeling and Bayesian inference, offering a scalable and principled framework for constraint-driven and symmetry-aware generation.
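As a minimal sketch of the two ingredients described in the abstract (the notation below is illustrative and not taken from the paper): VFM fits a variational distribution over flow endpoints, and post hoc Bayesian control reweights that distribution by a likelihood on the condition,
\[
\mathcal{L}(\theta) = \mathbb{E}_{t,\,x_1,\,x_t}\big[-\log q^\theta_t(x_1 \mid x_t)\big],
\qquad
u^\theta_t(x_t) = \mathbb{E}_{q^\theta_t(x_1 \mid x_t)}\big[u_t(x_t \mid x_1)\big],
\]
and, for a condition \(c\) imposed post hoc without retraining,
\[
q_t(x_1 \mid x_t, c) \;\propto\; p(c \mid x_1)\, q^\theta_t(x_1 \mid x_t).
\]
Here \(q^\theta_t\) denotes the learned variational posterior over endpoints \(x_1\) given the current state \(x_t\), and \(u_t(x_t \mid x_1)\) the conditional vector field; the exact objective and the equivariant parameterization are given in the paper.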
Lay Summary: This paper presents a new way to guide AI models to generate data that follows specific rules, such as making molecules with certain properties. The method, called Variational Flow Matching (VFM), works by smoothly transforming random noise into useful data. It can either be trained to follow rules from the start, or it can adjust an existing model afterward without retraining. We also make sure the model respects important symmetries, like a molecule still being the same if you rotate it. Our method works well on several molecule-building tasks, is flexible, and runs efficiently.
Primary Area: Deep Learning->Generative Models and Autoencoders
Keywords: Variational Flow Matching, Conditional Generation, Equivariance, Molecular Generation
Submission Number: 15993