Track: Machine learning: computational method and/or computational results
Nature Biotechnology: Yes
Keywords: drug discovery, discrete sampling
Abstract: Molecular discovery has brought great benefits to the chemical industry, and a variety of molecule design techniques have been developed to identify molecules with desirable properties. Traditional optimization methods, such as genetic algorithms, continue to achieve state-of-the-art results across multiple molecular design benchmarks. However, these techniques rely solely on random-walk exploration, which hinders both the quality of the final solution and the convergence speed. To address this limitation, we propose a novel approach called Gradient Genetic Algorithm (Gradient GA), which incorporates gradient information from the objective function into genetic algorithms. Instead of exploring at random, each proposed sample iteratively progresses toward an optimal solution by following the gradient direction. We achieve this by designing a differentiable objective function parameterized by a neural network and utilizing the Discrete Langevin Proposal to enable gradient guidance in discrete molecular spaces. Experimental results demonstrate that our method significantly improves both convergence speed and solution quality, outperforming cutting-edge techniques. For example, it achieves up to a 25% improvement in the top-10 score over the vanilla genetic algorithm. The code is publicly available at https://github.com/debadyuti23/GradientGA.
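Note: the abstract's key mechanism is the Discrete Langevin Proposal, which turns gradients of a differentiable surrogate objective into a categorical proposal distribution over discrete tokens. The following is a minimal sketch of one such proposal step, not the submission's implementation; the surrogate network `f`, the one-hot sequence encoding, and the step size `alpha` are all assumptions introduced here for illustration.

```python
import torch

def dlp_step(x_onehot: torch.Tensor, f, alpha: float = 0.1) -> torch.Tensor:
    """One gradient-guided Discrete Langevin Proposal over a (seq_len, vocab) one-hot tensor.

    `f` is assumed to be a differentiable scalar objective (e.g., a neural
    network surrogate of the molecular property); `alpha` is a hypothetical
    step-size hyperparameter.
    """
    x = x_onehot.clone().requires_grad_(True)
    grad = torch.autograd.grad(f(x), x)[0]  # d f / d x, shape (seq_len, vocab)

    # Factorized categorical proposal per position i:
    #   q_i(c) ∝ exp( 0.5 * (grad[i, c] - grad[i, x_i]) - 1{c != x_i} / alpha )
    cur_grad = (grad * x_onehot).sum(dim=-1, keepdim=True)
    logits = 0.5 * (grad - cur_grad) - (1.0 - x_onehot) / alpha
    probs = torch.softmax(logits, dim=-1)

    # Sample a new token at every position and re-encode as one-hot.
    new_tokens = torch.multinomial(probs, num_samples=1).squeeze(-1)
    return torch.nn.functional.one_hot(new_tokens, x_onehot.shape[-1]).float()
```

In a gradient-guided GA of this kind, such proposals would replace purely random mutation moves; a Metropolis-Hastings correction can optionally be applied to keep the sampling unbiased.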
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Presenter: ~Chris_Zhuang1
Format: No, the presenting author is unable to, or unlikely to be able to, attend in person.
Funding: Yes, the presenting author of this submission falls under ICLR’s funding aims, and funding would significantly impact their ability to attend the workshop in person.
Submission Number: 81