Graph Diffusion Transformers for Multi-Conditional Molecular Generation

Published: 25 Sept 2024, Last Modified: 06 Nov 2024 · NeurIPS 2024 oral · CC BY 4.0
Keywords: Graph Diffusion Transformers, Inverse Molecular Design, Molecular Generation, Polymer Generation
TL;DR: We introduce Graph Diffusion Transformers for multi-conditional inverse design of polymers and small molecules. The model offers predictor-free guidance by learning representations of categorical and numerical properties, enabling accurate conditional denoising.
Abstract: Inverse molecular design with diffusion models holds great potential for advancing material and drug discovery. Despite success in unconditional molecule generation, integrating multiple properties, such as synthetic score and gas permeability, as conditioning constraints in diffusion models remains unexplored. We present the Graph Diffusion Transformer (Graph DiT) for multi-conditional molecular generation. Graph DiT uses a condition encoder to learn representations of numerical and categorical properties and a Transformer-based graph denoiser to denoise molecular graphs under those conditions. Unlike previous graph diffusion models that add noise to atoms and bonds independently in the forward diffusion process, we propose a graph-dependent noise model for training Graph DiT, designed to accurately estimate graph-related noise in molecules. We extensively validate Graph DiT on multi-conditional polymer and small-molecule generation. Results demonstrate its superiority across metrics ranging from distribution learning to condition control of molecular properties. A polymer inverse design task for gas separation, with feedback from domain experts, further demonstrates its practical utility. The code is available at https://github.com/liugangcode/Graph-DiT.
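The abstract describes a condition encoder that learns joint representations of numerical and categorical properties. As a rough illustration only (this is not the authors' code, and the embedding choices here, sinusoidal features for scalars and a lookup table for categories, are assumptions), a minimal sketch of encoding mixed property conditions into one vector might look like:

```python
import numpy as np

def sinusoidal_embed(value, dim=8):
    """Embed a scalar property (e.g. a gas permeability target) with sin/cos features."""
    freqs = np.exp(np.linspace(0.0, 4.0, dim // 2))
    return np.concatenate([np.sin(value * freqs), np.cos(value * freqs)])

# Toy lookup table for a hypothetical categorical property.
CATEGORY_TABLE = {
    "soluble": np.array([1.0, 0.0]),
    "insoluble": np.array([0.0, 1.0]),
}

def encode_conditions(numerical, categorical, dim=8):
    """Concatenate per-property embeddings into one condition vector
    that a graph denoiser could consume."""
    parts = [sinusoidal_embed(v, dim) for v in numerical]
    parts += [CATEGORY_TABLE[c] for c in categorical]
    return np.concatenate(parts)

# Two numerical conditions (8-dim each) + one categorical condition (2-dim).
cond = encode_conditions([0.7, 2.3], ["soluble"])
print(cond.shape)  # (18,)
```

In the actual model the encoder is learned end-to-end with the denoiser, which is what enables predictor-free guidance; the fixed embeddings above only illustrate the mixed-type input handling.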
Supplementary Material: zip
Primary Area: Machine learning for physical sciences (for example: climate, physics)
Submission Number: 2570