Keywords: molecular docking, protein-ligand binding, transformer, equivariance, high-throughput screening, drug discovery
TL;DR: We introduce RapidDock, a first-in-class transformer-based model capable of accurate high-throughput molecular docking.
Abstract: Accelerating molecular docking -- the process of predicting how molecules bind to protein targets -- could boost small-molecule drug discovery and revolutionize medicine. Unfortunately, current molecular docking tools are too slow to screen potential drugs against all relevant proteins, which often results in missed drug candidates or in unexpected side effects that surface only in clinical trials.
To address this gap, we introduce RapidDock, an efficient transformer-based model for blind molecular docking.
RapidDock achieves at least a $100 \times$ speed advantage over existing methods without compromising accuracy.
On the PoseBusters and DockGen benchmarks, our method achieves $52.1\%$ and $44.0\%$ success rates ($\text{RMSD} < 2\,\text{Å}$), respectively.
The average inference time is $0.04$ seconds on a single GPU, highlighting RapidDock's potential for large-scale docking studies.
We examine the key features that let RapidDock leverage the transformer architecture for molecular docking: relative distance embeddings of 3D structures in the attention matrices, pre-training on protein folding, and a custom loss function invariant to molecular symmetries. We make the model code and weights publicly available.
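To make two of these ideas concrete, here is a minimal, hypothetical PyTorch sketch of (a) biasing attention logits with embeddings of binned pairwise 3D distances and (b) a loss that takes the minimum RMSD over chemically equivalent atom orderings. All names, the binning scheme, and the single-head layout are illustrative assumptions, not RapidDock's actual implementation.

```python
# Hedged sketch only -- not the authors' code. Names and binning are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelDistAttention(nn.Module):
    """Single-head attention whose logits are biased by pairwise 3D distances."""
    def __init__(self, dim: int, num_bins: int = 32, max_dist: float = 20.0):
        super().__init__()
        self.qkv = nn.Linear(dim, 3 * dim)
        self.out = nn.Linear(dim, dim)
        # One learned scalar bias per distance bin (hypothetical binning scheme).
        self.dist_bias = nn.Embedding(num_bins, 1)
        self.num_bins = num_bins
        self.max_dist = max_dist
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor, coords: torch.Tensor) -> torch.Tensor:
        # x: (N, dim) token features; coords: (N, 3) atom/residue positions.
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        logits = (q @ k.transpose(-2, -1)) * self.scale        # (N, N)
        # Bucket pairwise distances and add the learned bias to the logits.
        dist = torch.cdist(coords, coords)                     # (N, N)
        bins = torch.clamp((dist / self.max_dist * self.num_bins).long(),
                           max=self.num_bins - 1)
        logits = logits + self.dist_bias(bins).squeeze(-1)
        attn = F.softmax(logits, dim=-1)
        return self.out(attn @ v)

def symmetry_invariant_rmsd(pred: torch.Tensor,
                            target: torch.Tensor,
                            permutations: list) -> torch.Tensor:
    """Minimum RMSD over symmetry-equivalent atom orderings.

    pred, target: (N, 3) coordinates; permutations: LongTensors of shape (N,),
    assumed precomputed (e.g. via symmetry detection) and including identity.
    """
    rmsds = [torch.sqrt(((pred - target[perm]) ** 2).sum(-1).mean())
             for perm in permutations]
    return torch.stack(rmsds).min()
```

In a training loop, `symmetry_invariant_rmsd` would be applied to the predicted ligand pose so that equivalent atom labelings (e.g. in symmetric rings) are not penalized; taking the minimum over permutations is what makes the loss invariant to those symmetries.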
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 11173