Efficient Evolutionary Search over Chemical Space with Large Language Models

Published: 17 Jun 2024 (last modified: 17 Jul 2024), ICML 2024 AI4Science Workshop Poster, CC BY 4.0
Keywords: Large Language Models, Evolutionary Search, Molecule Optimization, AI for Science, Molecular generation
Abstract: Molecular discovery, when formulated as an optimization problem, presents significant computational challenges because the optimization objectives can be non-differentiable. Evolutionary Algorithms (EAs), often used to optimize black-box objectives in molecular discovery, traverse chemical space by performing random mutations and crossovers, which leads to a large number of expensive objective evaluations. In this work, we ameliorate this shortcoming by incorporating chemistry-aware Large Language Models (LLMs) into EAs. Specifically, we use both commercial and open-source LLMs trained on large corpora of chemical information as crossover and mutation operators within the EA. We perform an extensive empirical study on multiple tasks involving property optimization and molecular similarity, demonstrating that the joint use of LLMs with EAs outperforms all baseline models across single- and multi-objective settings. We show that our algorithm improves both the quality of the final solution and the convergence speed, thereby reducing the number of required objective evaluations.
Submission Number: 88
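As an illustration of the approach described in the abstract, the sketch below shows how LLM-generated proposals could act as crossover and mutation operators inside a standard evolutionary loop over SMILES strings. This is a minimal sketch rather than the paper's implementation: `query_llm` is a hypothetical placeholder for a chemistry-aware LLM call, the prompts are illustrative, and QED (computed with RDKit) stands in for the black-box objective.

```python
# Minimal sketch of an LLM-assisted evolutionary search over SMILES strings.
# Assumptions: RDKit is installed; `query_llm` is a hypothetical placeholder
# to be wired to a chemistry-aware LLM. Prompts and selection are illustrative.
import random
from rdkit import Chem
from rdkit.Chem import QED


def query_llm(prompt: str) -> str:
    """Placeholder: send `prompt` to an LLM and return a proposed SMILES string."""
    raise NotImplementedError("Connect this to an LLM of your choice.")


def score(smiles: str) -> float:
    """Black-box objective; drug-likeness (QED) is used here as an example."""
    mol = Chem.MolFromSmiles(smiles)
    return QED.qed(mol) if mol is not None else 0.0


def llm_crossover(parent_a: str, parent_b: str) -> str:
    prompt = (f"Combine structural motifs of these two molecules into a new valid "
              f"SMILES string: {parent_a} and {parent_b}. Answer with SMILES only.")
    return query_llm(prompt)


def llm_mutation(parent: str) -> str:
    prompt = (f"Propose a small, chemically sensible modification of {parent}. "
              f"Answer with a single valid SMILES string.")
    return query_llm(prompt)


def evolve(population: list[str], generations: int = 10, pop_size: int = 20) -> str:
    """Elitist evolutionary loop: keep the best half, refill with LLM offspring."""
    for _ in range(generations):
        elites = sorted(population, key=score, reverse=True)[: pop_size // 2]
        offspring: list[str] = []
        while len(elites) + len(offspring) < pop_size:
            a, b = random.sample(elites, 2)
            child = llm_mutation(llm_crossover(a, b))
            if Chem.MolFromSmiles(child) is not None:  # keep only valid molecules
                offspring.append(child)
        population = elites + offspring
    return max(population, key=score)
```

Selection, population sizing, and validity filtering are deliberately simplified here; the abstract's single- and multi-objective settings would replace the scalar `score` with the corresponding objective(s).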