Abstract: Molecular discovery has brought great benefits to the chemical industry. Many molecular design techniques have been developed to identify molecules with desirable properties. Traditional optimization methods, such as genetic algorithms, continue to achieve state-of-the-art results across molecular design benchmarks. However, these techniques rely solely on undirected random exploration, which limits both final solution quality and convergence speed.
To address this limitation, we propose a novel approach called Gradient Genetic Algorithm (Gradient GA), which incorporates gradient information from the objective function into genetic algorithms. Instead of random exploration, each proposed sample iteratively progresses toward an optimal solution by following the gradient direction. We achieve this by designing a differentiable objective function parameterized by a neural network and utilizing the Discrete Langevin Proposal to enable gradient guidance in discrete molecular spaces.
Experimental results demonstrate that our method significantly improves both convergence speed and solution quality, outperforming cutting-edge techniques. The proposed method has shown up to a $25\%$ improvement in the Top 10 score over the vanilla genetic algorithm. The code is publicly available at https://anonymous.4open.science/r/GradientGA-DC45.
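To make the gradient-guided proposal concrete, below is a minimal sketch of one Discrete-Langevin-style proposal step over one-hot encoded sequences. It assumes a hypothetical `SurrogateNet` standing in for the neural-network-parameterized differentiable objective described above; the names, tensor shapes, and `step_size` parameter are illustrative assumptions and not the actual implementation in the linked repository.

```python
import torch
import torch.nn.functional as F

# Hypothetical differentiable surrogate: maps a one-hot encoded sequence
# (batch, length, vocab) to a scalar property score. It stands in for the
# neural-network-parameterized objective mentioned in the abstract.
class SurrogateNet(torch.nn.Module):
    def __init__(self, length: int, vocab: int, hidden: int = 64):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Flatten(),
            torch.nn.Linear(length * vocab, hidden),
            torch.nn.ReLU(),
            torch.nn.Linear(hidden, 1),
        )

    def forward(self, x_onehot: torch.Tensor) -> torch.Tensor:
        return self.net(x_onehot).squeeze(-1)


def discrete_langevin_proposal(x_onehot: torch.Tensor,
                               surrogate: torch.nn.Module,
                               step_size: float = 0.1) -> torch.Tensor:
    """One gradient-guided proposal step in the style of the Discrete Langevin Proposal.

    x_onehot: (batch, length, vocab) one-hot tensor of candidate sequences.
    Returns a new one-hot tensor sampled coordinate-wise from a categorical
    distribution biased toward the gradient of the surrogate objective.
    """
    x = x_onehot.clone().requires_grad_(True)
    score = surrogate(x).sum()
    grad = torch.autograd.grad(score, x)[0]          # (batch, length, vocab)

    # Per-position logits: first-order improvement from switching to each
    # symbol, penalized by the squared distance from the current symbol
    # (||e_v - e_x||^2 / (2 * step_size) = 1 / step_size when v != x).
    current = (grad * x_onehot).sum(dim=-1, keepdim=True)
    logits = 0.5 * (grad - current) - (1.0 - x_onehot) / step_size

    # Sample one symbol per position from the resulting categorical proposal.
    probs = F.softmax(logits, dim=-1)
    idx = torch.distributions.Categorical(probs=probs).sample()
    return F.one_hot(idx, num_classes=x_onehot.shape[-1]).float()


if __name__ == "__main__":
    torch.manual_seed(0)
    length, vocab = 20, 30
    surrogate = SurrogateNet(length, vocab)
    start = torch.randint(0, vocab, (4, length))
    x = F.one_hot(start, num_classes=vocab).float()
    x_new = discrete_langevin_proposal(x, surrogate)
    print(x_new.shape)  # torch.Size([4, 20, 30])
```

In a genetic-algorithm loop, a step like this would replace purely random mutation: offspring are drawn from a proposal whose logits follow the surrogate's gradient, so candidates drift toward higher-scoring regions of the discrete space rather than wandering uniformly.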
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Emmanuel_Bengio1
Submission Number: 5238