Efficient molecular conformer generation with SO(3) averaged flow-matching and reflow

ICLR 2025 Conference Submission 12252 Authors

27 Sept 2024 (modified: 26 Nov 2024) · ICLR 2025 Conference Submission · CC BY 4.0
Keywords: Flow-matching, few-shot generation, equivariance, small molecules
TL;DR: We present a novel flow-matching objective and combine it with distillation for highly efficient molecular conformer generation
Abstract: Molecular conformer generation is a critical task in computational chemistry and drug discovery. Diverse generative deep learning methods have been proposed and shown to outperform traditional cheminformatics tools. State-of-the-art models leverage neural transport, employing denoising diffusion or flow-matching to generate or refine atomic point clouds from a prior distribution. Still, sampling with existing models requires significant computational expense. In this work, we build upon flow-matching and propose two mechanisms for accelerating training and inference of 3D molecular conformer generation. For fast training, we introduce the SO(3)-Averaged Flow, which we show to converge faster and generate better conformer ensembles compared to flow-matching and Kabsch alignment-based optimal transport flow. For fast inference, we further show that reflow and distillation of these models enable few-step or even one-step molecular conformer generation with high quality. Using these two techniques, we demonstrate a model that can match the performance of strong transformer baselines with only a fraction of the number of parameters and generation steps. The training techniques proposed in this work lay the foundation for highly efficient molecular conformer generation with generative deep learning models.
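The abstract's core idea, averaging the flow-matching regression target over global SO(3) rotations of the target conformer, can be illustrated with a minimal Monte-Carlo sketch. This is an assumption-based illustration, not the authors' exact objective: the function names (`random_rotation`, `so3_averaged_target`), the linear interpolant, and the number of rotation samples are all illustrative choices.

```python
import numpy as np

def random_rotation(rng):
    """Sample an approximately uniform rotation matrix from SO(3) via QR."""
    A = rng.normal(size=(3, 3))
    Q, R = np.linalg.qr(A)
    Q = Q @ np.diag(np.sign(np.diag(R)))  # fix the sign ambiguity of QR
    if np.linalg.det(Q) < 0:
        Q[:, 0] = -Q[:, 0]  # ensure det(Q) = +1, i.e. a proper rotation
    return Q

def so3_averaged_target(x0, x1, n_rot=8, rng=None):
    """Monte-Carlo estimate of a rotation-averaged linear-flow target.

    For conditional flow matching with the interpolant
    x_t = (1 - t) * x0 + t * x1, the velocity target is x1 - x0.
    Here the target conformer x1 (an N x 3 coordinate array) is averaged
    over random global rotations, so the regression target marginalizes
    over the arbitrary orientation of the conformer instead of relying
    on a single Kabsch alignment.
    """
    rng = rng or np.random.default_rng(0)
    targets = []
    for _ in range(n_rot):
        R = random_rotation(rng)
        x1_rot = x1 @ R.T          # rotate the target point cloud
        targets.append(x1_rot - x0)  # linear-flow velocity target
    return np.mean(targets, axis=0)
```

In practice such an average would appear inside the training loss (the network's predicted velocity regressed against the averaged target); the sketch only shows the target construction itself.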
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 12252