Regression Transformer: Concurrent Conditional Generation and Regression by Blending Numerical and Textual Tokens

Published: 05 Apr 2022, Last Modified: 05 May 2023, MLDD Oral
Keywords: deep learning, transformers, generative modeling, regression, regression transformer, conditional molecular design, de novo design, protein language modeling
TL;DR: A Transformer that treats numbers as sequences of tokens and thus seamlessly integrates regression and conditional generation
Abstract: We report the Regression Transformer (RT), a method that abstracts regression as a conditional sequence modeling problem. The RT casts continuous properties as sequences of numerical tokens and encodes them jointly with conventional tokens. This yields a dichotomous model that concurrently excels at regression and property-driven conditional generation, and can seamlessly transition between the two tasks, governed solely by the mask location. We propose several extensions to the XLNet objective and adopt an alternating training scheme to concurrently optimize property prediction and conditional text generation with a self-consistency loss. Our experiments on both chemical and protein languages demonstrate that the performance of traditional regression models can be surpassed despite training with a cross-entropy loss. Importantly, priming the same model with continuous properties yields a highly competitive conditional generative model that outperforms specialized approaches in a constrained property optimization benchmark. In sum, the Regression Transformer opens the door for "swiss army knife" models that excel at both regression and conditional generation. This finds application particularly in property-driven, local exploration of the chemical or protein space. The code to reproduce all experiments of the paper is available at: https://anonymous.4open.science/r/regression-transformer/
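To make the numerical-token idea concrete, below is a minimal, hypothetical Python sketch (not taken from the paper or its repository) of how a continuous property could be cast into digit-wise tokens and how the mask location alone could select between regression and generation. The token format "_d_p_" (digit d at decimal place p), the [MASK] placeholder, and both helper functions are illustrative assumptions, not the released vocabulary or API.

    # Hypothetical sketch of the numerical tokenization idea: a float is
    # split into one token per digit, tagged with its decimal place.
    # The "_d_p_" token format is an assumption for illustration only.

    def tokenize_float(value: float, precision: int = 3) -> list[str]:
        """E.g. 0.723 -> ['_0_0_', '_._', '_7_-1_', '_2_-2_', '_3_-3_']."""
        assert value >= 0, "sign handling omitted in this sketch"
        text = f"{value:.{precision}f}"
        integer, _, fraction = text.partition(".")
        # Integer digits get non-negative decimal places (..., 1, 0).
        tokens = [f"_{d}_{p}_"
                  for d, p in zip(integer, range(len(integer) - 1, -1, -1))]
        tokens.append("_._")
        # Fractional digits get negative decimal places (-1, -2, ...).
        tokens += [f"_{d}_{-(i + 1)}_" for i, d in enumerate(fraction)]
        return tokens

    def build_masked_sequence(prop_tokens: list[str],
                              text_tokens: list[str],
                              task: str) -> list[str]:
        """Concatenate property and text tokens, masking the span the
        model must predict: masking the numerical span yields a
        regression task, masking the text span a generation task."""
        if task == "regression":
            return ["[MASK]"] * len(prop_tokens) + text_tokens
        return prop_tokens + ["[MASK]"] * len(text_tokens)

    # Example: property value 0.723 paired with a toy SMILES sequence.
    prop = tokenize_float(0.723)
    smiles = list("CCO")
    print(build_masked_sequence(prop, smiles, "regression"))  # predict the number
    print(build_masked_sequence(prop, smiles, "generation"))  # predict the molecule

In the actual model, which span is predicted is handled through the (extended) XLNet permutation objective rather than a literal [MASK] token; the sketch only illustrates how a single sequence format can support both tasks.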