How to make the most of your masked language model for protein engineering

Published: 02 Mar 2026, Last Modified: 16 Apr 2026 · GEM 2026 · CC BY 4.0
Keywords: antibodies, proteins, masked language models, sampling algorithms, protein engineering
TL;DR: We propose a new approach to using masked language models for protein optimization, and report results from a record-breaking in vitro benchmarking effort evaluating models and sampling methods.
Abstract: A plethora of protein language models have been released in recent years, yet comparatively little work has addressed how best to sample from them to optimize desired biological properties. We fill this gap by proposing a flexible, effective sampling method for masked language models (MLMs), and by systematically evaluating models and methods both in silico and in vitro on real antibody therapeutics campaigns. First, we propose sampling with stochastic beam search, exploiting the fact that MLMs are surprisingly efficient at evaluating the pseudo-perplexity of the entire 1-edit neighborhood of a sequence. Reframing generation in terms of whole-sequence evaluation enables flexible guidance with multiple optimization objectives. Second, we report results from our extensive in vitro head-to-head evaluation in the antibody engineering setting, which reveals that the choice of sampling method is at least as impactful as the choice of model, motivating future research into this under-explored area.
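To make the sampling idea concrete, below is a minimal, self-contained sketch of stochastic beam search over the 1-edit neighborhood of a sequence. Everything here is illustrative rather than the paper's released implementation: score_sequences is a toy stand-in for a batched MLM pseudo-log-likelihood scorer, and the beam parameters and softmax sampling scheme are assumptions.

```python
"""Hedged sketch: stochastic beam search over the 1-edit neighborhood,
guided by a whole-sequence score. `score_sequences` is a hypothetical
stand-in for an MLM pseudo-log-likelihood; swap in a real model to use it.
"""
import math
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def score_sequences(seqs):
    # Stand-in scorer: a real implementation would batch sequences through
    # an MLM and return pseudo-log-likelihoods (higher = better). This toy
    # score just lets the sketch run end to end.
    return [sum(AMINO_ACIDS.index(a) for a in s) / len(s) for s in seqs]

def one_edit_neighborhood(seq):
    # All single-substitution variants of `seq`.
    return [seq[:i] + aa + seq[i + 1:]
            for i, orig in enumerate(seq)
            for aa in AMINO_ACIDS if aa != orig]

def stochastic_beam_search(seed, beam_size=8, steps=3, temperature=1.0, rng=None):
    # At each step, score every 1-edit variant of every beam member, then
    # sample the next beam from a softmax over scores, without replacement.
    # (Gumbel top-k noise would make this a proper stochastic beam; plain
    # softmax sampling is shown here for brevity.)
    rng = rng or random.Random(0)
    beam = [seed]
    for _ in range(steps):
        candidates = sorted({n for s in beam for n in one_edit_neighborhood(s)})
        weights = [math.exp(sc / temperature) for sc in score_sequences(candidates)]
        pool = list(zip(candidates, weights))
        chosen = []
        for _ in range(min(beam_size, len(pool))):
            r = rng.random() * sum(w for _, w in pool)
            acc = 0.0
            for j, (cand, w) in enumerate(pool):
                acc += w
                if acc >= r:
                    chosen.append(cand)
                    pool.pop(j)
                    break
        beam = chosen
    return max(beam, key=lambda s: score_sequences([s])[0])

if __name__ == "__main__":
    print(stochastic_beam_search("MKTAYIAKQR"))
```

Because candidates are scored as whole sequences, additional objectives can be folded into score_sequences (e.g., as a weighted sum with a property predictor), which is what makes this framing amenable to the multi-objective guidance the abstract describes.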
Submission Number: 49