MING: A Functional Approach to Learning Molecular Generative Models

Published: 22 Jan 2025 · Last Modified: 09 Mar 2025 · AISTATS 2025 Poster · License: CC BY 4.0
Abstract: Traditional molecule generation methods often rely on sequence- or graph-based representations, which can limit their expressive power or require complex permutation-equivariant architectures. This paper introduces a novel paradigm for learning molecule generative models based on functional representations. Specifically, we propose Molecular Implicit Neural Generation (MING), a diffusion-based model that learns molecular distributions in the function space. Unlike standard diffusion processes in the data space, MING employs a novel functional denoising probabilistic process, which jointly denoises information in both the function's input and output spaces by leveraging an expectation-maximization procedure for latent implicit neural representations of data. This approach enables a simple yet effective model design that accurately captures underlying function distributions. Experimental results on molecule-related datasets demonstrate MING's superior performance and ability to generate plausible molecular samples, surpassing state-of-the-art data-space methods while offering a more streamlined architecture and significantly faster generation times. The code is available at https://github.com/v18nguye/MING.
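For readers unfamiliar with functional (implicit neural) representations of molecules, the sketch below gives a minimal, hypothetical illustration of the two generic ingredients the abstract refers to: a latent-conditioned implicit neural representation that maps function inputs (e.g., positional encodings of atoms) to function outputs (e.g., atom features), and a standard Gaussian forward-noising step applied to the per-molecule latent. This is not the authors' code, and names such as `LatentINR`, `fit_latent`, and `noise_latent` are assumptions for illustration only; MING's actual functional denoising process, which jointly handles the function's input and output spaces via an expectation-maximization procedure, is described in the paper and repository.

```python
import torch
import torch.nn as nn


class LatentINR(nn.Module):
    """Hypothetical latent-conditioned implicit neural representation (INR):
    maps a function input x and a per-molecule latent z to a function output y."""

    def __init__(self, in_dim=3, latent_dim=64, hidden=128, out_dim=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim + latent_dim, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, x, z):
        # x: (N, in_dim) function inputs, z: (latent_dim,) molecule latent
        z_rep = z.unsqueeze(0).expand(x.size(0), -1)
        return self.net(torch.cat([x, z_rep], dim=-1))


def fit_latent(inr, x, y, latent_dim=64, steps=50, lr=1e-2):
    """Inner-loop fit of a per-molecule latent to observed (input, output) pairs,
    a simple stand-in for an expectation-style step over latent representations."""
    z = torch.zeros(latent_dim, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(inr(x, z), y)
        loss.backward()
        opt.step()
    return z.detach()


def noise_latent(z, t, betas):
    """Standard Gaussian forward-diffusion noising of the latent (illustrative only)."""
    alpha_bar = torch.cumprod(1.0 - betas, dim=0)[t]
    eps = torch.randn_like(z)
    return alpha_bar.sqrt() * z + (1.0 - alpha_bar).sqrt() * eps, eps


if __name__ == "__main__":
    inr = LatentINR()
    x = torch.randn(16, 3)      # toy function inputs (e.g., atom positional encodings)
    y = torch.randn(16, 8)      # toy function outputs (e.g., atom feature vectors)
    z = fit_latent(inr, x, y)   # latent summarizing this molecule's function
    betas = torch.linspace(1e-4, 2e-2, 1000)
    z_t, eps = noise_latent(z, t=500, betas=betas)
    print(z_t.shape, eps.shape)
```

Under these assumptions, a generative model would be trained to reverse the noising step; the paper's contribution is to carry out such denoising directly in function space rather than on a fixed data-space representation.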
Submission Number: 458