Normality Calibration in Semi-supervised Graph Anomaly Detection

ICLR 2026 Conference Submission 19076 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Graph Anomaly Detection, Semi-supervised Learning, Graph Representation Learning, Anomaly Detection
TL;DR: We propose GraphNC, a graph normality calibration framework that leverages both labeled and unlabeled data to calibrate the normality from a teacher model.
Abstract: Graph anomaly detection (GAD) has attracted growing interest for its crucial ability to uncover irregular patterns in broad applications. Semi-supervised GAD, which assumes that a subset of annotated normal nodes is available during training, is among the most widely explored application settings. However, the normality learned by existing semi-supervised GAD methods is limited to the labeled normal nodes, often inclining them to overfit the given patterns, which can lead to high detection errors such as high false positives. To overcome this limitation, we propose $GraphNC$, a graph normality calibration framework that leverages both labeled and unlabeled data to calibrate the normality from a teacher model (a pre-trained semi-supervised GAD model) jointly in the anomaly score and node representation spaces. GraphNC comprises two main components: anomaly score distribution alignment ($ScoreDA$) and perturbation-based normality regularization ($NormReg$). ScoreDA optimizes the anomaly scores of our model by aligning them with the score distribution yielded by the teacher model. Since the teacher model assigns accurate scores to most of the normal nodes and part of the anomalous nodes, this score alignment effectively pulls the anomaly scores of the normal and abnormal classes toward the two ends, resulting in more separable anomaly scores. Nevertheless, some of the teacher's scores are inaccurate. To mitigate the misleading effect of these scores, NormReg regularizes the graph normality in the representation space, making the representations of normal nodes more compact by minimizing a perturbation-guided consistency loss solely on the labeled nodes. Through comprehensive experiments on six benchmark datasets, we show that, by jointly optimizing these two components, GraphNC can 1) consistently and substantially enhance the GAD performance of teacher models from different types of GAD methods and 2) achieve new state-of-the-art semi-supervised GAD performance.
Supplementary Material: zip
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 19076
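The two components described in the abstract can be sketched in simplified form. The snippet below is a minimal NumPy sketch, not the authors' implementation: it assumes a plain MSE form for both the ScoreDA alignment objective and the NormReg consistency loss, and the function names and the Gaussian-noise perturbation are illustrative assumptions (the paper's exact losses and perturbation scheme may differ).

```python
import numpy as np

def score_alignment_loss(student_scores, teacher_scores):
    # ScoreDA (sketch): align the student's anomaly scores with the
    # score distribution yielded by the teacher model. A plain MSE
    # between per-node scores is an assumption; the paper's exact
    # alignment objective may differ.
    return float(np.mean((student_scores - teacher_scores) ** 2))

def normreg_loss(embed_fn, x_labeled_normal, noise_scale=0.1, seed=0):
    # NormReg (sketch): perturbation-guided consistency computed solely
    # on labeled normal nodes -- their representations should stay close
    # under small input perturbations, making normal-node embeddings
    # more compact. Gaussian input noise is an illustrative choice.
    rng = np.random.default_rng(seed)
    z = embed_fn(x_labeled_normal)
    x_pert = x_labeled_normal + noise_scale * rng.standard_normal(
        x_labeled_normal.shape
    )
    z_pert = embed_fn(x_pert)
    return float(np.mean((z - z_pert) ** 2))
```

In a full training loop the two losses would be summed (optionally with a weighting factor) and backpropagated through the student GAD model, while the teacher's scores stay fixed.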