Generative Model for Small Molecules with Latent Space RL Fine-Tuning to Protein Targets

Published: 17 Jun 2024, Last Modified: 03 Jul 2024
Venue: AccMLBio Poster
License: CC BY 4.0
Keywords: Small-Molecule Generation, Large Language Models, Reinforcement Learning
TL;DR: We present a novel generative latent-variable transformer model for small-molecule generation and demonstrate that the model's latent space can be fine-tuned to generate hit molecules for specific protein targets.
Abstract: A specific challenge for deep learning approaches to molecule generation is producing molecular string representations that are both syntactically valid and chemically plausible. To address this, we propose a novel generative latent-variable transformer model for small molecules that leverages a recently proposed molecular string representation called SAFE. We introduce a modification to SAFE that reduces the number of invalid fragmented molecules generated during training and use it to train our model. Our experiments show that the model generates novel molecules with a validity rate $>$ 90\% and a fragmentation rate $<$ 1\% when sampling from the latent space. By fine-tuning the model with reinforcement learning to improve molecular docking scores, we significantly increase the number of hit candidates for five specific protein targets compared to the pre-trained model, nearly doubling this number for certain targets. Additionally, our top-5\% mean docking scores are comparable to the current state of the art (SOTA), and we marginally outperform SOTA on three of the five targets.
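The validity and fragmentation rates quoted in the abstract can be computed directly from a batch of generated SMILES strings. The sketch below is a minimal illustration, not the paper's actual evaluation pipeline: it takes a caller-supplied validity predicate (in practice something like RDKit's `Chem.MolFromSmiles`; the `toy_valid` stand-in here is purely hypothetical) and flags fragmentation via the `.` separator, which in SMILES denotes disconnected components.

```python
from typing import Callable, Sequence


def generation_metrics(
    smiles: Sequence[str],
    is_valid: Callable[[str], bool],
) -> dict:
    """Compute validity and fragmentation rates for generated molecules.

    A SMILES string containing '.' encodes disconnected fragments, which
    is how an unattached (fragmented) decoding shows up in string form.
    """
    n = len(smiles)
    valid = sum(1 for s in smiles if is_valid(s))
    fragmented = sum(1 for s in smiles if "." in s)
    return {
        "validity_rate": valid / n,
        "fragmentation_rate": fragmented / n,
    }


# Hypothetical stand-in for a real chemistry check such as RDKit's
# Chem.MolFromSmiles; it only rejects a trivially malformed string.
def toy_valid(s: str) -> bool:
    return bool(s) and not s.endswith("(")


sample = ["CCO", "c1ccccc1", "CC(", "CCO.CC"]
print(generation_metrics(sample, toy_valid))
# → {'validity_rate': 0.75, 'fragmentation_rate': 0.25}
```

In a real evaluation the predicate would parse each string with a cheminformatics toolkit, and novelty would additionally be checked against the training set.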
Submission Number: 23