Enhancing diversity in language-based models for single-step retrosynthesis

25 Jul 2022 (modified: 12 Oct 2022) · OpenReview Anonymous Preprint Blind Submission · Readers: Everyone
Keywords: deep learning, machine learning, NLP, chemistry, synthesis planning, AI, diversity
TL;DR: We aim to improve the diversity of retrosynthesis predictions in NLP-based synthesis planning, allowing for wider choices in recursive synthesis tools
Abstract: Over the past four years, several research groups have demonstrated that combining domain-specific language representations with recent NLP architectures can accelerate innovation in a wide range of scientific fields. Chemistry is a prime example. Among the various chemical challenges addressed with language models, retrosynthesis exhibits some of the most distinctive successes and limitations. Single-step retrosynthesis, the task of identifying reactions able to decompose a complex molecule into simpler structures, can be cast as a translation problem in which a text-based representation of the target molecule is converted into a sequence of possible precursors. A common issue is a lack of diversity in the proposed disconnection strategies: the suggested precursors typically fall into the same reaction family, which limits the exploration of the chemical space. We present a retrosynthesis Transformer model that increases the diversity of the predictions by prepending a classification token to the language representation of the target molecule. At inference, these prompt tokens allow us to steer the model towards different kinds of disconnection strategies. We show that the diversity of the predictions improves consistently, enabling recursive synthesis tools to circumvent dead ends and, consequently, to suggest synthesis pathways for more complex molecules.
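
The prompt-token mechanism described in the abstract can be illustrated with a short sketch. The snippet below is a hypothetical, minimal illustration only: the token names ([RXN_0]…[RXN_9]), the example SMILES string, and the commented-out translation call are assumptions chosen for clarity, not the paper's actual token vocabulary or model API.

```python
# Minimal sketch of class-token prompting for a retrosynthesis Transformer.
# All identifiers here are illustrative assumptions, not the authors' code.

TARGET_SMILES = "CC(=O)Oc1ccccc1C(=O)O"  # example target molecule (aspirin)

# Hypothetical prompt tokens, one per disconnection-strategy class.
CLASS_TOKENS = [f"[RXN_{i}]" for i in range(10)]

def build_prompted_source(class_token: str, smiles: str) -> str:
    """Prepend a classification token to the source sequence so the
    decoder is conditioned on the desired reaction family."""
    return f"{class_token} {smiles}"

# At inference, iterating over the prompt tokens steers the model toward
# different disconnection strategies, yielding a more diverse precursor set.
for token in CLASS_TOKENS:
    source = build_prompted_source(token, TARGET_SMILES)
    # precursors = model.translate(source, beam_size=5)  # hypothetical seq2seq call
    print(source)
```

Under this scheme, a recursive synthesis planner can query the same model several times with different prompt tokens and fall back to an alternative reaction family whenever one branch of the search reaches a dead end.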