EVO-RDesign: Leveraging Evolutionary Priors for Structure-Based RNA Design

27 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: RNA Design, RNA Tertiary Structures, Drug Development, Evolutionary Priors, RNA Language Model
TL;DR: EVO-RDesign enhances RNA sequence design using evolutionary priors from language models, achieving superior sequence recovery and zero-shot generalization compared to existing methods.
Abstract: Designing RNA sequences based on RNA tertiary structures is a crucial aspect of future RNA design with significant potential to aid drug development. Recently, deep learning-based methods have made progress in this area; however, they are constrained by the limited availability of RNA structural data, making it challenging to achieve optimal performance. In this paper, we propose EVO-RDesign, which leverages the evolutionary priors embedded in extensive sequence data to enable better RNA sequence design. Specifically, RNA language models have recently been shown to capture the evolutionary information of RNA. We therefore treat RNA language models as repositories of evolutionary priors and design a series of adaptors that allow EVO-RDesign to retrieve these priors conditioned on the input RNA structural information. To further improve performance, the adaptor feeds both the RNA structural information and the outputs of existing RNA design methods into the language model. Experiments demonstrate that EVO-RDesign outperforms RDesign, achieving a 3.5% increase in sequence recovery on RNAsolo. It also exhibits zero-shot generalization, with gains of 5.1% and 4.1% in sequence recovery on RNA-Puzzles and Rfam, respectively. We also apply in-silico folding to validate whether the generated sequences can fold into the specified 3D RNA backbones.
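To make the adaptor idea in the abstract concrete, below is a minimal, hypothetical sketch (not the authors' released code) of how per-residue structural features and the logits of an existing structure-based RNA design model might be projected into the embedding space of a pretrained RNA language model, whose output is then decoded into refined nucleotide probabilities. All module names, dimensions, and the stand-in language model are illustrative assumptions.

```python
# Hypothetical sketch of a structure-conditioned adaptor for an RNA language model.
# Nothing here is taken from the EVO-RDesign implementation; it only illustrates the
# data flow described in the abstract.
import torch
import torch.nn as nn

NUM_NUCLEOTIDES = 4  # A, U, C, G


class StructureConditionedAdaptor(nn.Module):
    """Fuses structural features with a baseline design method's logits."""

    def __init__(self, struct_dim: int, lm_dim: int):
        super().__init__()
        # Project per-residue structural features (e.g., backbone geometry encodings).
        self.struct_proj = nn.Linear(struct_dim, lm_dim)
        # Project the baseline design method's per-residue nucleotide logits.
        self.prior_proj = nn.Linear(NUM_NUCLEOTIDES, lm_dim)
        self.fuse = nn.Sequential(nn.LayerNorm(lm_dim), nn.Linear(lm_dim, lm_dim))

    def forward(self, struct_feats: torch.Tensor, baseline_logits: torch.Tensor) -> torch.Tensor:
        # struct_feats: (batch, length, struct_dim); baseline_logits: (batch, length, 4)
        return self.fuse(self.struct_proj(struct_feats) + self.prior_proj(baseline_logits))


class EvoConditionedDesigner(nn.Module):
    """Wraps a (pretrained) RNA language model with the adaptor and a decoding head."""

    def __init__(self, rna_lm: nn.Module, struct_dim: int, lm_dim: int):
        super().__init__()
        self.adaptor = StructureConditionedAdaptor(struct_dim, lm_dim)
        self.rna_lm = rna_lm  # stand-in for a pretrained RNA language model
        self.head = nn.Linear(lm_dim, NUM_NUCLEOTIDES)

    def forward(self, struct_feats: torch.Tensor, baseline_logits: torch.Tensor) -> torch.Tensor:
        cond = self.adaptor(struct_feats, baseline_logits)
        hidden = self.rna_lm(cond)   # language model consumes structure-conditioned embeddings
        return self.head(hidden)     # refined per-residue nucleotide logits


if __name__ == "__main__":
    # Toy stand-in for the RNA language model: any module mapping (B, L, D) -> (B, L, D).
    toy_lm = nn.TransformerEncoder(
        nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True), num_layers=2
    )
    model = EvoConditionedDesigner(toy_lm, struct_dim=32, lm_dim=64)
    logits = model(torch.randn(2, 50, 32), torch.randn(2, 50, 4))
    print(logits.shape)  # torch.Size([2, 50, 4])
```

In this reading, the evolutionary prior enters through the frozen language model, while the adaptor is the only component trained on the scarce structural data; whether EVO-RDesign follows exactly this split is an assumption on our part.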
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 8576