DiffGraphTrans: A Differential Attention-Based Approach for Extracting Meaningful Features of Drug Combinations

Published: 06 Mar 2025, Last Modified: 18 Apr 2025 · ICLR 2025 Workshop LMRL · CC BY 4.0
Track: Tiny Paper Track
Keywords: representation learning, Transformer, drug combination prediction, attention mechanism, bioinformatics
TL;DR: We propose DiffGraphTrans, a differential-attention-based graph Transformer. It suppresses molecular noise, amplifies key functional groups, and outperforms baseline models in drug combination prediction accuracy and interpretability.
Abstract: Predicting synergistic drug combinations is critical for treating complex diseases, yet existing graph-based methods struggle to balance noise suppression and interpretability in molecular representations. Specifically, the heterogeneity of molecular graphs causes Transformer-based models to amplify high-frequency noise while masking low-frequency signals linked to functional groups. To address this, we propose the Differential Graph Transformer (DiffGraphTrans), which integrates a learnable differential filter into multi-head attention. Our model dynamically suppresses irrelevant atomic interactions and amplifies key functional groups. Experiments on lung cancer drug combinations show that DiffGraphTrans outperforms baseline models and significantly improves biochemical interpretability through attention weight analysis. Our framework provides a principled approach to learning noise-robust, biologically meaningful embeddings, advancing interpretable AI for drug discovery.
Attendance: Qi Wang
Submission Number: 103
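
The submission above is abstract-only, so the exact parameterization of the learnable differential filter is not public. As a rough illustration of the mechanism the abstract describes, the sketch below assumes a DIFF-Transformer-style differential attention: two softmax attention maps are computed from independent query/key projections and subtracted with a learnable weight (`lam` here), so that attention mass shared by both maps, which tends to be noise, cancels out, while the residual differential signal is kept. The adjacency mask restricting attention to bonded atoms, the single-head simplification, and all names are hypothetical choices for this sketch, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DifferentialGraphAttention(nn.Module):
    """Single-head sketch of differential attention over a molecular graph.

    Two softmax attention maps are subtracted with a learnable scalar so
    that attention mass common to both maps (treated as noise) cancels,
    leaving the differential signal between atom interactions.
    """
    def __init__(self, dim: int):
        super().__init__()
        # Two independent query/key projections produce the two maps.
        self.q1, self.k1 = nn.Linear(dim, dim), nn.Linear(dim, dim)
        self.q2, self.k2 = nn.Linear(dim, dim), nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.lam = nn.Parameter(torch.tensor(0.5))  # learnable filter weight
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (n_atoms, dim) node features; adj: (n_atoms, n_atoms) 0/1
        # adjacency with self-loops, so every softmax row stays finite.
        mask = adj == 0  # forbid attention between non-bonded atom pairs
        a1 = (self.q1(x) @ self.k1(x).T) * self.scale
        a2 = (self.q2(x) @ self.k2(x).T) * self.scale
        a1 = a1.masked_fill(mask, float("-inf"))
        a2 = a2.masked_fill(mask, float("-inf"))
        # Differential filter: shared (noisy) attention cancels in the
        # subtraction; the remaining signal weights the value vectors.
        attn = F.softmax(a1, dim=-1) - self.lam * F.softmax(a2, dim=-1)
        return attn @ self.v(x)

# Toy usage on a random 5-atom graph with symmetric bonds and self-loops.
x = torch.randn(5, 32)
adj = (torch.rand(5, 5) > 0.5).float()
adj = ((adj + adj.T) > 0).float().fill_diagonal_(1.0)
out = DifferentialGraphAttention(32)(x, adj)
print(out.shape)  # torch.Size([5, 32])
```

In this reading, the subtraction acts as the "differential filter": high-frequency noise that both attention maps agree on is removed, while interactions that only the first map emphasizes, such as those within a functional group, are retained and relatively amplified.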