Abstract: Molecular discovery has brought great benefit to the chemical industry. Various molecular design techniques have been developed to identify molecules with desirable properties.
Traditional optimization methods, such as genetic algorithms, continue to achieve state-of-
the-art results across various molecular design benchmarks. However, these techniques rely
solely on undirected random exploration, which hinders both the quality of the final solution
and the convergence speed. To address this limitation, we propose a novel approach called
Gradient Genetic Algorithm (Gradient GA), which incorporates gradient information from
the objective function into genetic algorithms. Instead of random exploration, each proposed
sample iteratively progresses toward an optimal solution by following the gradient direction.
We achieve this by designing a differentiable objective function parameterized by a neural
network and utilizing the Discrete Langevin Proposal to enable gradient guidance in discrete
molecular spaces. Experimental results demonstrate that our method significantly improves
both convergence speed and solution quality, outperforming cutting-edge techniques. The
proposed method achieves up to a 25% improvement in the Top 10 score over the vanilla genetic algorithm. The code is available at https://github.com/debadyuti23/GradientGA.
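To make the core idea concrete, below is a minimal, hedged sketch of a gradient-guided proposal step in the spirit of the Discrete Langevin Proposal the abstract refers to. The toy linear objective `f`, its gradient, the binary search space, and the step size `alpha` are all illustrative placeholders, not the paper's actual neural-network surrogate or molecular representation; see the linked repository for the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder for the paper's differentiable, neural-network-parameterized
# objective; here a toy linear score over a binary vector so the gradient
# is available in closed form.
W = np.array([1.0, -2.0, 3.0, 0.5])

def f(x):
    return float(W @ x)

def grad_f(x):
    return W  # gradient of the linear toy objective is constant

def dlp_step(x, alpha=0.5):
    """One coordinate-wise gradient-guided proposal step on {0,1}^d.

    Each coordinate flips with probability proportional to
    exp( g_i * (x'_i - x_i) / 2 - (x'_i - x_i)^2 / (2 * alpha) ),
    so flips that increase the objective are favored over random ones.
    """
    g = grad_f(x)
    delta = 1.0 - 2.0 * x  # flip direction: +1 if x_i = 0, -1 if x_i = 1
    logits = 0.5 * g * delta - 1.0 / (2.0 * alpha)
    p_flip = np.exp(logits) / (np.exp(logits) + 1.0)
    flips = rng.random(x.shape) < p_flip
    return np.where(flips, 1.0 - x, x)

# Run a short chain: samples drift toward coordinates with positive gradient
# instead of exploring purely at random, which is the key difference the
# abstract highlights over a vanilla genetic algorithm's mutation step.
x = np.zeros(4)
for _ in range(20):
    x = dlp_step(x)
print(x, f(x))
```

In a full Gradient GA loop, a step like this would replace undirected mutation inside the genetic algorithm, with the gradient supplied by the learned differentiable surrogate rather than a fixed linear score.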
Submission Length: Regular submission (no more than 12 pages of main content)
Code: https://github.com/debadyuti23/GradientGA
Assigned Action Editor: ~Emmanuel_Bengio1
Submission Number: 5238