Multi-Modal Foundation Models Induce Interpretable Molecular Graph Languages

27 Sept 2024 (modified: 01 Dec 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: multimodal foundation models, molecular design, interpretability
Abstract:

Recently, domain-specific languages (DSLs) for molecular generation have shown advantages in data efficiency and interpretability. However, constructing such a DSL requires human expertise or incurs significant computational cost. Multi-modal foundation models (MMFMs) have shown remarkable in-context abilities for tasks across the vision and text domains, but not for graphs. We explore an unconventional solution: we render the molecule as an image, describe it using text, and cast DSL construction as the equivalent problem of constructing a tree decomposition of the molecular graph. The MMFM makes a chain of discrete decisions that replace the traditional heuristics used during the decomposition, smoothly integrating its prior knowledge without compromising the soundness of the algorithm. Furthermore, we collect the MMFM’s reasoning for each decision into a design story, have non-expert agents evaluate the stories for correctness and persuasiveness, and close the feedback loop to improve the DSL. Our method, Foundation Molecular Grammar (FMG), demonstrates significant advantages in synthesizability, diversity, and data efficiency on molecule generation benchmarks. Moreover, its compelling chemical interpretability offers built-in transparency over the molecular discovery workflow, paving the way for additional feedback and oversight.
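
To make the abstract's pipeline concrete, the sketch below (not from the submission) illustrates the two modalities and the clique-level decomposition it describes, assuming an RDKit environment; the MMFM query is a hypothetical placeholder, not an API named by the authors.

```python
# Minimal sketch of the abstract's pipeline: render a molecule as an image,
# describe it as text, enumerate candidate cliques for a junction-tree-style
# decomposition, and (hypothetically) ask an MMFM for the next discrete decision.
from rdkit import Chem
from rdkit.Chem import Draw

def clique_candidates(mol):
    """Collect rings and non-ring bonds as candidate cliques of the molecular graph."""
    cliques = [set(ring) for ring in mol.GetRingInfo().AtomRings()]
    for bond in mol.GetBonds():
        if not bond.IsInRing():
            cliques.append({bond.GetBeginAtomIdx(), bond.GetEndAtomIdx()})
    return cliques

def multimodal_views(mol):
    """Produce the image and text views that would be sent to an MMFM."""
    image = Draw.MolToImage(mol)      # visual modality
    text = Chem.MolToSmiles(mol)      # textual modality
    # query_mmfm(image, text, cliques) -> index of the clique to process next
    # (hypothetical call; substitute a real MMFM client here)
    return image, text

if __name__ == "__main__":
    mol = Chem.MolFromSmiles("c1ccccc1C(=O)O")   # benzoic acid as a toy example
    print(clique_candidates(mol))
    multimodal_views(mol)
```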

Supplementary Material: pdf
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 12433