Caduceus: Bi-Directional Equivariant Long-Range DNA Sequence Modeling

Published: 17 Jun 2024, Last Modified: 05 Jul 2024 · AccMLBio Poster · CC BY 4.0
Keywords: Machine Learning, Deep Learning, DNA Language Modeling
TL;DR: We present bi-directional and reverse complement (RC) equivariant extensions to the Mamba block and use these to propose a new DNA language model.
Abstract: Large-scale sequence modeling has sparked rapid advances that now extend into biology and genomics. However, modeling genomic sequences introduces challenges such as the need to model long-range token interactions, the effects of upstream and downstream regions of the genome, and the reverse complementarity (RC) of DNA. Here, we propose an architecture motivated by these challenges that builds on the long-range Mamba block, extending it to a BiMamba component that supports bi-directionality and to a MambaDNA block that additionally supports RC equivariance. We use MambaDNA as the basis of Caduceus, the first family of RC-equivariant bi-directional long-range DNA language models, and we introduce pre-training and fine-tuning strategies that yield Caduceus DNA foundation models. Caduceus outperforms previous long-range models on downstream benchmarks; on a challenging long-range variant effect prediction task, Caduceus exceeds the performance of 10x larger models that do not leverage bi-directionality or equivariance.
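The two architectural ideas in the abstract, weight-tied bi-directional processing and RC equivariance, can be illustrated with a short sketch. The code below is a minimal illustration, not the authors' implementation: it assumes a generic weight-tied sequence mixer standing in for the Mamba block and a one-hot DNA encoding with channels ordered A, C, G, T; the names `BiDirectionalWrapper`, `reverse_complement`, and `RCEquivariantWrapper` are hypothetical.

```python
# Minimal sketch of bi-directional, RC-equivariant sequence processing.
# Assumption: inputs are (batch, length, 4) tensors with channels A, C, G, T.
import torch
import torch.nn as nn


class BiDirectionalWrapper(nn.Module):
    """Run a single weight-tied module over the sequence and its reversal,
    then sum the two outputs (bi-directionality in the spirit of BiMamba)."""

    def __init__(self, inner: nn.Module):
        super().__init__()
        self.inner = inner  # stand-in for a Mamba block (hypothetical)

    def forward(self, x):  # x: (batch, length, channels)
        fwd = self.inner(x)
        rev = torch.flip(self.inner(torch.flip(x, dims=[1])), dims=[1])
        return fwd + rev


def reverse_complement(x):
    """Reverse the sequence axis and the channel axis; with channels ordered
    A, C, G, T, channel reversal implements the A<->T, C<->G complement."""
    return torch.flip(x, dims=[1, 2])


class RCEquivariantWrapper(nn.Module):
    """Average the module's output on x with the RC-mapped output on RC(x),
    so the wrapped map g satisfies g(RC(x)) == RC(g(x)) because RC is an
    involution."""

    def __init__(self, inner: nn.Module):
        super().__init__()
        self.inner = inner

    def forward(self, x):
        direct = self.inner(x)
        mirrored = reverse_complement(self.inner(reverse_complement(x)))
        return 0.5 * (direct + mirrored)


if __name__ == "__main__":
    # Toy mixer: a pointwise linear layer (placeholder for a real Mamba block).
    mixer = nn.Linear(4, 4)
    model = RCEquivariantWrapper(BiDirectionalWrapper(mixer))
    x = torch.randn(2, 16, 4)
    # Equivariance check: applying RC to the input commutes with the model.
    assert torch.allclose(
        model(reverse_complement(x)), reverse_complement(model(x)), atol=1e-6
    )
    print(model(x).shape)  # torch.Size([2, 16, 4])
```

The averaging construction makes any inner module RC equivariant by symmetrization; the paper's MambaDNA block achieves this property architecturally, which this sketch does not attempt to reproduce.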
Submission Number: 37