ProteinAdapter: Adapting Pre-trained Large Protein Models for Efficient Protein Representation Learning
Keywords: Protein Representation, Structured State Space Models
TL;DR: The first Mamba-based adapter for parameter-efficient fine-tuning of pre-trained large protein models.
Abstract: The study of proteins is central to many scientific disciplines, yet understanding their intricate multi-level relationships remains challenging. Motivated by the sequence and structure understanding of Large Protein Models (LPMs), we introduce ProteinAdapter, which efficiently transfers the broad knowledge encapsulated in multiple LPMs, e.g., ESM-1b, into task-specific insights, largely avoiding labor-intensive analysis of 3D positions and amino acid order. Specifically, (1) with a modest number of additional parameters, ProteinAdapter enables multi-level protein representation learning by integrating both sequence and geometric structure embeddings from LPMs; (2) building on the learned embeddings, we further scale ProteinAdapter to diverse tasks with a unified Multi-Scale Predictor, which harnesses the learned embeddings through task-specific attention. Despite its simplicity, the proposed method scales to multiple downstream tasks without bells and whistles. Extensive experiments on over 20 tasks show that ProteinAdapter outperforms state-of-the-art methods in both single-task and multi-task scenarios.
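For intuition only, here is a minimal sketch of the general pattern the abstract describes: a lightweight adapter that fuses frozen per-residue embeddings from a sequence LPM (e.g., ESM-1b) with embeddings from a structure encoder, adding only a small number of trainable parameters. This is a hypothetical illustration, not the authors' implementation; all module names, the cross-attention fusion choice, and the dimensions (1280-d ESM-1b outputs, a 512-d structure encoder) are assumptions.

```python
# Hypothetical sketch (not the paper's code): fuse frozen sequence-LPM
# embeddings with structure embeddings via a small trainable adapter.
import torch
import torch.nn as nn


class ProteinAdapterSketch(nn.Module):
    def __init__(self, seq_dim=1280, struct_dim=512, hidden_dim=256):
        super().__init__()
        # Small projections keep the added parameter count modest;
        # the LPM backbones producing the inputs stay frozen.
        self.seq_proj = nn.Linear(seq_dim, hidden_dim)
        self.struct_proj = nn.Linear(struct_dim, hidden_dim)
        # Cross-attention lets sequence tokens attend to structure tokens
        # (an illustrative fusion choice, not necessarily the paper's).
        self.cross_attn = nn.MultiheadAttention(
            hidden_dim, num_heads=4, batch_first=True
        )
        self.norm = nn.LayerNorm(hidden_dim)

    def forward(self, seq_emb, struct_emb):
        # seq_emb:    (batch, n_residues, seq_dim)   from a frozen sequence LPM
        # struct_emb: (batch, n_residues, struct_dim) from a structure encoder
        q = self.seq_proj(seq_emb)
        kv = self.struct_proj(struct_emb)
        fused, _ = self.cross_attn(q, kv, kv)
        # Residual connection preserves the sequence-level signal.
        return self.norm(q + fused)


if __name__ == "__main__":
    adapter = ProteinAdapterSketch()
    seq = torch.randn(2, 100, 1280)
    struct = torch.randn(2, 100, 512)
    print(adapter(seq, struct).shape)  # torch.Size([2, 100, 256])
```

A task head along the lines of the abstract's Multi-Scale Predictor would then pool these per-residue embeddings with per-task attention weights before a prediction layer; the details of that head are specific to the paper and not reproduced here.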
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9568