Probabilistic thermal stability prediction through sparsity promoting transformer representation

Published: 18 Nov 2022, Last Modified: 05 May 2023 · RobustSeq @ NeurIPS 2022 Poster
Keywords: Protein Melting Temperature, Transformer Models, Embeddings, Gaussian Processes, Regression, Sparsity promotion
TL;DR: Improvement of mean-pooling of pre-trained transformer's latent representations by learning a mask.
Abstract: Pre-trained protein language models have demonstrated significant applicability to various protein engineering tasks. A common way to use the latent representations of these pre-trained transformer models is to mean-pool across residue positions, reducing the feature dimension for downstream tasks such as predicting biophysical properties or other functional behaviours. In this paper we provide a two-fold contribution to machine learning (ML) driven drug design. Firstly, we demonstrate the power of sparsity-promoting penalization of pre-trained transformer representations to secure more robust and accurate melting temperature (Tm) prediction of single-chain variable fragments, with a mean absolute error of 0.23 °C. Secondly, we demonstrate the power of framing our prediction problem in a probabilistic framework; specifically, we advocate for adopting probabilistic frameworks in the context of ML-driven drug design.
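The abstract contrasts plain mean-pooling of per-residue transformer embeddings with a learned, sparsity-promoted weighting (per the TL;DR, "learning a mask"). A minimal sketch of this idea is below; the function names, the sigmoid parameterization of the mask, and the L1 penalty weight are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def masked_mean_pool(embeddings, mask_logits):
    """Weighted mean-pool of per-residue embeddings.

    embeddings: (L, d) array of per-residue representations from a
        pre-trained protein language model (hypothetical input).
    mask_logits: (L,) learnable scores; a sigmoid keeps the mask in [0, 1].
    Returns the pooled (d,) vector and the mask weights.
    """
    w = 1.0 / (1.0 + np.exp(-mask_logits))          # soft mask in [0, 1]
    pooled = (w[:, None] * embeddings).sum(axis=0) / (w.sum() + 1e-8)
    return pooled, w

def sparsity_penalty(w, lam=0.01):
    # L1 penalty added to the regression loss to push mask weights
    # toward zero, so only a few residue positions drive the prediction.
    return lam * np.abs(w).sum()

# With uniform logits the weighted pool reduces to plain mean-pooling,
# which is the baseline the paper improves upon.
emb = np.random.randn(8, 4)
pooled, w = masked_mean_pool(emb, np.zeros(8))
```

With all logits at zero every residue gets equal weight, recovering the plain mean-pool baseline; training the logits under the L1 penalty then lets the model down-weight uninformative positions before the downstream (here, Gaussian-process) regressor sees the features.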