Title: Efficient Multi-Scale Graph Anomaly Detection via Bidirectional Contrastive Knowledge Distillation
TL;DR: a single-teacher bidirectional contrastive knowledge distillation framework that trains a compact student for efficient multi-scale graph anomaly detection
Abstract: Graph Anomaly Detection (GAD) faces the challenge of identifying irregular patterns across multiple structural scales while maintaining computational efficiency for real-world deployment. Existing knowledge distillation approaches rely on unidirectional teacher-student alignment, producing brittle embeddings that fail to establish robust decision boundaries between normal and anomalous patterns. We introduce \textsc{ReCoDistill}, a unified framework that combines bidirectional contrastive learning with progressive checkpoint-based distillation using a single teacher network. Our approach simultaneously optimizes two complementary objectives: (1) attracting student embeddings toward clean teacher representations while (2) repelling them from structured multi-scale noisy teacher outputs. We develop a dynamic curriculum mechanism that selects optimal teacher checkpoints based on complexity-compatibility trade-offs, progressing from local to global semantics. Unlike existing methods requiring multiple teacher networks, \textsc{ReCoDistill} achieves superior efficiency through single-teacher architecture while maintaining state-of-the-art performance. Evaluation on 14 benchmark datasets demonstrates that \textsc{ReCoDistill} achieves the best detection accuracy (88.93\% AUROC on Amazon, 89.80\% on BM-MN), superior zero-shot transfer performance across 9 out of 12 cross-task scenarios, and substantial computational efficiency improvements. Our theoretical analysis provides convergence guarantees and generalization properties, establishing \textsc{ReCoDistill} as the first computationally efficient GAD framework to unify bidirectional contrastive learning with progressive distillation.
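The abstract's first objective — pulling student embeddings toward the clean teacher representation while pushing them away from noisy multi-scale teacher outputs — can be sketched as an InfoNCE-style contrastive loss. This is a hypothetical minimal illustration, not the paper's actual implementation: the function name, cosine similarity choice, and temperature value are all assumptions.

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def bidirectional_contrastive_loss(student, teacher_clean, teacher_noisy, tau=0.1):
    """InfoNCE-style sketch (assumed form, not the paper's exact loss):
    the clean teacher embedding acts as the positive (attraction),
    the noisy multi-scale teacher outputs act as negatives (repulsion)."""
    pos = np.exp(cosine(student, teacher_clean) / tau)
    negs = sum(np.exp(cosine(student, n) / tau) for n in teacher_noisy)
    return -np.log(pos / (pos + negs))

# Toy usage: a student aligned with the clean teacher incurs a lower
# loss than one aligned with a noisy teacher output.
rng = np.random.default_rng(0)
t_clean = rng.normal(size=16)
t_noisy = [rng.normal(size=16) for _ in range(4)]
loss_aligned = bidirectional_contrastive_loss(t_clean, t_clean, t_noisy)
loss_misaligned = bidirectional_contrastive_loss(t_noisy[0], t_clean, t_noisy)
```

Minimizing this loss simultaneously realizes both stated objectives: the numerator grows as the student matches the clean teacher, and the denominator penalizes similarity to the structured noisy outputs.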
Supplementary Material: zip
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 8323