EpiFormer: a transformer-based multi-relational equivariant graph neural network for antibody-aware epitope prediction
Keywords: Geometric deep learning, Equivariant graph neural network, Self-supervised pretraining, Transformers, Antibody design, Protein-protein interaction, Epitope prediction
Abstract: Antibodies are essential components of the immune system, neutralizing foreign antigens such as viruses by binding to specific regions called epitopes. Computational prediction of epitopes is critical for antibody design and therapeutic development, yet it remains challenging due to: (1) the lack of architectures sophisticated enough to model complex interaction patterns; (2) ineffective protein representations; (3) antibody-agnostic modeling despite the antibody specificity of epitopes; (4) severe class imbalance; and (5) the scarcity of known antigen–antibody complexes. To overcome these challenges, we propose EpiFormer, an encoder-decoder architecture that couples an E(3)-equivariant multi-relational graph neural network (GNN) with cross-attention to model antigen–antibody interactions. Our contributions are an E(3)-equivariant multi-relational GNN, a Transformer-style cross-attention mechanism, and losses tailored to severe class imbalance and data scarcity. Our method significantly outperforms existing baselines on the Antibody-specific Epitope Prediction (AsEP) dataset, achieving an overall 1.7x improvement across multiple classification metrics. This work advances the state of the art in antibody-aware epitope prediction, providing a robust framework for therapeutic antibody design and vaccine development.
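To make the antibody-aware decoding idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of a cross-attention head in which antigen residue embeddings attend to antibody residue embeddings before per-residue epitope classification. The module name, embedding dimension, and head count are illustrative assumptions; the GNN encoders producing the embeddings are stubbed with random tensors.

```python
# Hypothetical sketch: antigen residues query antibody residues via multi-head
# cross-attention, then a per-residue head scores each antigen residue as
# epitope / non-epitope. Dimensions are assumptions, not values from the paper.
import torch
import torch.nn as nn


class AntibodyAwareEpitopeHead(nn.Module):
    def __init__(self, dim: int = 128, num_heads: int = 8):
        super().__init__()
        # Queries come from antigen nodes; keys/values from antibody nodes.
        self.cross_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)
        self.classifier = nn.Sequential(
            nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, 1)
        )

    def forward(self, antigen_h: torch.Tensor, antibody_h: torch.Tensor) -> torch.Tensor:
        # antigen_h:  (B, N_ag, dim) node embeddings from an antigen-side encoder
        # antibody_h: (B, N_ab, dim) node embeddings from an antibody-side encoder
        attn_out, _ = self.cross_attn(antigen_h, antibody_h, antibody_h)
        h = self.norm(antigen_h + attn_out)      # residual connection + layer norm
        return self.classifier(h).squeeze(-1)    # (B, N_ag) per-residue epitope logits


# Toy usage with random embeddings standing in for GNN encoder outputs.
antigen_h = torch.randn(1, 200, 128)   # 200 antigen residues
antibody_h = torch.randn(1, 120, 128)  # 120 antibody residues
logits = AntibodyAwareEpitopeHead()(antigen_h, antibody_h)
print(logits.shape)  # torch.Size([1, 200])
```

Because the antibody embeddings enter only through the keys and values of the cross-attention, the same antigen can receive different epitope scores for different antibodies, which is the antibody-specific behavior the abstract describes.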
Supplementary Material: zip
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 23067