Learning to Make Generalizable and Diverse Predictions for Retrosynthesis

Anonymous

Sep 25, 2019 · ICLR 2020 Conference Blind Submission
  • TL;DR: We propose a new model for making generalizable and diverse retrosynthetic reaction predictions.
  • Abstract: We propose a new model for making generalizable and diverse retrosynthetic reaction predictions. Given a target compound, the task is to predict the likely chemical reactants to produce the target. This generative task can be framed as a sequence-to-sequence problem by using the SMILES representations of the molecules. Building on top of the popular Transformer architecture, we propose two novel pre-training methods that construct relevant auxiliary tasks (plausible reactions) for our problem. Furthermore, we incorporate a discrete latent variable model into the architecture to encourage the model to produce a diverse set of alternative predictions. On the 50k subset of reaction examples from the United States patent literature (USPTO-50k) benchmark dataset, our model greatly improves performance over the baseline, while also generating predictions that are more diverse.
  • Code: https://github.com/iclr-2020-retro/retro_smiles_transformer
  • Keywords: Chemistry, Retrosynthesis, Transformer, Pre-training, Diversity
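The abstract frames retrosynthesis as a sequence-to-sequence problem over SMILES strings: the target molecule is the source sequence and the reactants are the target sequence. A minimal sketch of that framing is below; the regex is a commonly used SMILES tokenization pattern, and the aspirin example reaction is illustrative only, not taken from the paper or its code.

```python
import re

# A common regex for splitting SMILES into chemically meaningful tokens
# (multi-character atoms like Br/Cl, bracket atoms, bonds, ring digits).
SMILES_TOKEN_PATTERN = re.compile(
    r"(\[[^\]]+\]|Br|Cl|Si|@@|@|=|#|\(|\)|\.|\+|-|%[0-9]{2}|[a-zA-Z]|[0-9]|/|\\)"
)

def tokenize(smiles: str) -> list:
    """Split a SMILES string into tokens for a seq2seq model."""
    return SMILES_TOKEN_PATTERN.findall(smiles)

# Illustrative retrosynthesis pair: target compound (aspirin) as the
# source sequence, predicted reactants as the target sequence.
# The "." token separates individual reactant molecules.
target = "CC(=O)Oc1ccccc1C(=O)O"
reactants = "CC(=O)OC(C)=O.Oc1ccccc1C(=O)O"

src_tokens = tokenize(target)
tgt_tokens = tokenize(reactants)
print(src_tokens)
```

Tokenized in this way, target/reactant pairs can be fed to any standard encoder-decoder architecture such as the Transformer the paper builds on.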