Learning to Make Generalizable and Diverse Predictions for Retrosynthesis

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Blind Submission
TL;DR: We propose a new model for making generalizable and diverse retrosynthetic reaction predictions.
Abstract: We propose a new model for making generalizable and diverse retrosynthetic reaction predictions. Given a target compound, the task is to predict the likely chemical reactants to produce the target. This generative task can be framed as a sequence-to-sequence problem by using the SMILES representations of the molecules. Building on top of the popular Transformer architecture, we propose two novel pre-training methods that construct relevant auxiliary tasks (plausible reactions) for our problem. Furthermore, we incorporate a discrete latent variable model into the architecture to encourage the model to produce a diverse set of alternative predictions. On the 50k subset of reaction examples from the United States patent literature (USPTO-50k) benchmark dataset, our model greatly improves performance over the baseline, while also generating predictions that are more diverse.
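To make the sequence-to-sequence framing concrete: the target molecule's SMILES string is tokenized into a source sequence for the Transformer, and the model decodes the reactants' SMILES as the target sequence. The sketch below shows only the tokenization step, using the regex-based SMILES tokenizer commonly used in this literature; the function name and details are illustrative assumptions, not taken from the paper's released code.

```python
import re

# Illustrative sketch: tokenizing a target compound's SMILES string into
# the input tokens of a sequence-to-sequence Transformer. The pattern is a
# simplified version of the regex tokenizer widely used for reaction
# prediction; it is an assumption, not the paper's exact preprocessing.
SMILES_TOKEN_PATTERN = re.compile(
    r"(\[[^\]]+\]|Br|Cl|%\d{2}|[BCNOSPFI]|[bcnops]|[()=#\-+\\/:~@?>*$.]|\d)"
)

def tokenize_smiles(smiles: str) -> list[str]:
    """Split a SMILES string into tokens, verifying the split is lossless."""
    tokens = SMILES_TOKEN_PATTERN.findall(smiles)
    assert "".join(tokens) == smiles, "tokenization must be lossless"
    return tokens

# Example: aspirin as a source sequence for retrosynthesis.
print(tokenize_smiles("CC(=O)Oc1ccccc1C(=O)O"))
```

In a full pipeline, these token sequences would be embedded and fed to the Transformer encoder, with the decoder emitting reactant SMILES token by token; the paper's latent variable would additionally condition decoding to diversify the predicted reactant sets.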
Code: https://github.com/iclr-2020-retro/retro_smiles_transformer
Keywords: Chemistry, Retrosynthesis, Transformer, Pre-training, Diversity