Regression Transformer enables concurrent sequence regression and generation for molecular language modelling

Abstract: Transformer models have become increasingly popular for modelling natural language because they can produce human-sounding text by iteratively predicting the next word in a sentence. Born and Manica apply the idea of Transformer-based text completion to chemical compounds: a molecule and its numerical properties are encoded as a single text sequence, and the model completes whichever part is masked, yielding property prediction when the property tokens are hidden and property-conditioned generation when parts of the molecule are hidden.
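To make the completion framing concrete, here is a minimal Python sketch of the idea under stated assumptions. The token format, the `serialize`, `mask_property`, and `mask_molecule` helpers, and the example SMILES string and property value are all illustrative inventions, not the authors' actual tokenizer or implementation; the sketch only shows how masking different spans of one sequence turns the same completion task into regression or generation.

```python
# Illustrative sketch (not the authors' code): a molecule and a numerical
# property are serialized into one token sequence; masking different spans
# turns the same completion task into regression or conditional generation.

MASK = "[MASK]"

def serialize(smiles: str, prop_name: str, prop_value: float) -> list[str]:
    """Encode a property and a molecule as a single token sequence.
    Digits are split into individual tokens so a model can 'read' numbers."""
    prop_tokens = [f"<{prop_name}>"] + list(f"{prop_value:.2f}")
    mol_tokens = list(smiles)  # character-level tokenization, for illustration
    return prop_tokens + ["|"] + mol_tokens

def mask_property(tokens: list[str]) -> list[str]:
    """Mask the numeric tokens: the completion task becomes regression."""
    bar = tokens.index("|")
    return [tokens[0]] + [MASK] * (bar - 1) + tokens[bar:]

def mask_molecule(tokens: list[str], start: int, end: int) -> list[str]:
    """Mask a molecular span: the task becomes property-conditioned generation."""
    bar = tokens.index("|")
    out = tokens[:]
    for i in range(bar + 1 + start, bar + 1 + end):
        out[i] = MASK
    return out

# Aspirin with a made-up solubility value, purely for demonstration.
seq = serialize("CC(=O)OC1=CC=CC=C1C(=O)O", "esol", -2.12)
print("regression input: ", " ".join(mask_property(seq)))
print("generation input: ", " ".join(mask_molecule(seq, 3, 8)))
```

A trained model would then fill in the masked tokens: digit tokens for the regression input, molecular tokens for the generation input, which is what lets a single model serve both tasks.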