Graph Attention with Knowledge-Aware Domain Adaptation for Drug-Target Interaction Prediction

ICLR 2026 Conference Submission9880 Authors

17 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Graph Attention Network, Knowledge-Aware Network, Domain Adaptation
Abstract: Predicting drug-target interactions (DTIs) under domain shift is a central challenge in data-driven drug discovery. We propose DTI-DA, a practical framework that combines (i) a Graph Attention Network (GAT) for compound encoding, (ii) a Knowledge-Aware Network (KAN) for injecting prior chemical and biological relations into representation learning, and (iii) domain adaptation (DA) via maximum mean discrepancy combined with adversarial domain discrimination. The resulting system is end-to-end, modular, and reproducible. We distinguish two reporting tracks: source-only (no access to unlabeled target data for any method) and transductive UDA (unlabeled target examples aid distribution alignment while target labels remain strictly hidden). Baseline comparators are reported in parallel to contextualize performance improvements. We make no claims of statistical significance; all numbers are single-run point estimates under a fixed protocol. Minor differences between runs (e.g., an AUC of 0.744 in the primary comparison vs. 0.7452 in an ablation) stem from separate runs with identical hyperparameter settings, do not affect the conclusions, and are retained for fidelity to the recorded runs. Under these settings, DTI-DA matches or exceeds classical machine-learning baselines such as SVM and RF, as well as widely used deep baselines such as GraphDTA and MolTrans, on BioSNAP and BindingDB. For example, on BioSNAP it achieves an AUC of 0.744 and an AUPR of 0.757, corresponding to a relative improvement of 0.895%.
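The maximum mean discrepancy term mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the RBF kernel and the `sigma` bandwidth are assumptions, and in practice the loss would be computed on mini-batches of source and target embeddings produced by the GAT/KAN encoder.

```python
import numpy as np

def rbf_kernel(x, y, sigma=1.0):
    # Pairwise RBF kernel values between rows of x and rows of y.
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def mmd2(source, target, sigma=1.0):
    # Biased estimator of squared maximum mean discrepancy between
    # source-domain and target-domain embedding batches.
    return (rbf_kernel(source, source, sigma).mean()
            + rbf_kernel(target, target, sigma).mean()
            - 2.0 * rbf_kernel(source, target, sigma).mean())
```

Minimizing this quantity (alongside the task loss and an adversarial domain-discrimination loss) pulls the source and target embedding distributions together; it is near zero when the two batches are drawn from the same distribution and grows as they diverge.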
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 9880