A Data-centric Framework to Endow Graph Neural Networks with Out-Of-Distribution Detection Ability

Published: 01 Jan 2023 · Last Modified: 11 Feb 2025 · KDD 2023 · License: CC BY-SA 4.0
Abstract: Out-of-distribution (OOD) detection, which aims to distinguish OOD samples from in-distribution (ID) ones at test time, has become an essential problem in machine learning. However, existing works are mostly conducted on Euclidean data, and the problem on graph-structured data remains under-explored. Several recent works begin to study graph OOD detection, but they all need to train a graph neural network (GNN) from scratch, at high computational cost. In this work, we make the first attempt to endow a well-trained GNN with the OOD detection ability without modifying its parameters. To this end, we design a post-hoc framework with Adaptive Amplifier for Graph OOD Detection, named AAGOD, concentrating on data-centric manipulation. The insight of AAGOD is to superimpose a parameterized amplifier matrix on the adjacency matrix of each original input graph. The amplifier can be seen as a prompt and is expected to emphasize the key patterns helpful for graph OOD detection, thereby enlarging the gap between OOD and ID graphs. Well-trained GNNs can then be reused to encode the amplified graphs into vector representations, and pre-defined scoring functions can further convert the representations into detection scores. Specifically, we design a Learnable Amplifier Generator (LAG) to customize amplifiers for different graphs, and propose a Regularized Learning Strategy (RLS) to train the parameters with no OOD data required. Experimental results show that AAGOD can be applied to various GNNs to endow them with OOD detection ability. Compared with the state-of-the-art baseline in graph OOD detection, AAGOD achieves a 6.21% relative improvement in AUC on average and a 34 times faster training speed. Code and data are available at https://github.com/BUPT-GAMMA/AAGOD.
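The following minimal PyTorch sketch illustrates the data-centric pipeline the abstract describes: a frozen, well-trained GNN encoder, a learnable amplifier generator that produces a per-graph amplifier matrix superimposed on the adjacency matrix, and a pre-defined scoring function over the resulting graph embedding. All class and function names, the dense-adjacency GCN, the norm-based score, and the exact rule for combining the amplifier with the adjacency are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of the AAGOD idea from the abstract; not the paper's actual code.
import torch
import torch.nn as nn

class DenseGCNEncoder(nn.Module):
    """Stand-in for a well-trained GNN; its parameters stay frozen (assumption)."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.lin1 = nn.Linear(in_dim, hid_dim)
        self.lin2 = nn.Linear(hid_dim, hid_dim)

    def forward(self, adj, x):
        h = torch.relu(adj @ self.lin1(x))   # one message-passing step
        h = adj @ self.lin2(h)               # second message-passing step
        return h.mean(dim=0)                 # mean-pool nodes into a graph embedding

class AmplifierGenerator(nn.Module):
    """Sketch of a Learnable Amplifier Generator (LAG): produces a per-graph,
    entrywise amplifier matrix from node features (architecture is an assumption)."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, hid_dim)

    def forward(self, x):
        z = self.proj(x)
        return torch.sigmoid(z @ z.t())      # entry (i, j) weights edge (i, j)

def ood_score(embedding):
    """Pre-defined scoring function; the embedding norm is a placeholder choice."""
    return embedding.norm()

# Toy usage: one graph with 5 nodes and 8-dimensional features.
adj = (torch.rand(5, 5) > 0.5).float()
adj = ((adj + adj.t()) > 0).float()          # symmetrize the adjacency matrix
x = torch.randn(5, 8)

frozen_gnn = DenseGCNEncoder(8, 16)
for p in frozen_gnn.parameters():            # reuse the GNN without modifying it
    p.requires_grad_(False)

lag = AmplifierGenerator(8, 16)              # only the amplifier generator is trainable
amplified_adj = adj * lag(x)                 # superimpose amplifier on the adjacency
score = ood_score(frozen_gnn(amplified_adj, x))
print(f"detection score: {score.item():.4f}")
```

In this sketch only the amplifier generator carries trainable parameters, matching the abstract's claim that the well-trained GNN is reused without modification; the Regularized Learning Strategy (RLS) for training those parameters without OOD data is not shown here.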
