Track: Machine learning: computational method and/or computational results
Keywords: Protein representation learning, protein language models, knowledge distillation
Abstract: Protein language models are a powerful tool for learning protein representations through pre-training on vast protein sequence datasets.
However, traditional protein language models lack explicit structural supervision, despite the relevance of structure to protein function.
To address this issue, we integrate remote homology detection as a training objective that distills structural information into protein language models without requiring explicit protein structures as input.
We evaluate the impact of this structure-informed training on downstream protein function prediction tasks.
Experimental results show consistent improvements in function annotation accuracy for EC number and GO term prediction. Performance on mutant datasets, however, varies with the strength of the relationship between the targeted properties and protein structure, underscoring the importance of considering this relationship when applying structure-aware training to protein function prediction tasks. Code and model weights will be made available upon acceptance.
Submission Number: 17