BOND: Benchmarking Unsupervised Outlier Node Detection on Static Attributed Graphs

04 Jun 2022, 00:57 (modified: 15 Oct 2022, 20:01), NeurIPS 2022 Datasets and Benchmarks. Readers: Everyone
Keywords: Graph Mining, Graph Neural Networks, Outlier Detection, Benchmark
TL;DR: We present BOND, a comprehensive benchmark for unsupervised node outlier detection on attributed static graphs.
Abstract: Detecting which nodes in a graph are outliers is a relatively new machine learning task with numerous applications. Despite the proliferation of algorithms developed for this task in recent years, there has been no standard, comprehensive setting for performance evaluation. Consequently, it has been difficult to understand which methods work well, and under which conditions, across a broad range of settings. To bridge this gap, we present, to the best of our knowledge, the first comprehensive benchmark for unsupervised outlier node detection on static attributed graphs, called BOND, with the following highlights. (1) We benchmark the outlier detection performance of 14 methods, ranging from classical matrix factorization to the latest graph neural networks. (2) Using nine real datasets, our benchmark assesses how the different detection methods respond to two major types of synthetic outliers and, separately, to "organic" (real, non-synthetic) outliers. (3) Using an existing random graph generation technique, we produce a family of synthetically generated datasets of different graph sizes, which enables us to compare the running time and memory usage of the different outlier detection algorithms. Based on our experimental results, we discuss the pros and cons of existing graph outlier detection algorithms, and we highlight opportunities for future research. Importantly, our code is freely available and meant to be easily extendable:
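To make the two synthetic outlier types concrete, the following is a minimal, hypothetical sketch (not the benchmark's actual implementation): structural outliers are nodes rewired into a dense clique, while contextual outliers are nodes whose attribute vectors are replaced with those of a far-away node, so they deviate from their neighborhood. The function names, the adjacency/attribute representations, and the farthest-of-k-candidates heuristic are all illustrative assumptions.

```python
import random

def inject_structural(adj, clique):
    """Structural outliers (sketch): wire the chosen nodes into a
    fully connected clique, creating an unusually dense subgraph.
    adj: dict mapping node id -> set of neighbor ids (undirected)."""
    for u in clique:
        for v in clique:
            if u != v:
                adj[u].add(v)
                adj[v].add(u)
    return adj

def inject_contextual(x, outliers, k, rng):
    """Contextual outliers (sketch): for each chosen node, sample k
    candidate nodes and copy the attributes of the candidate farthest
    away in Euclidean distance, so the node no longer matches its context.
    x: dict mapping node id -> attribute list."""
    nodes = list(x)
    for u in outliers:
        cands = rng.sample(nodes, k)
        far = max(cands, key=lambda v: sum((a - b) ** 2
                                           for a, b in zip(x[u], x[v])))
        x[u] = list(x[far])
    return x
```

A detector is then scored on how well it ranks the injected node ids above the untouched ones, typically via ROC-AUC against the known injection labels.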
Supplementary Material: pdf
Dataset Url:
License: BSD-2-Clause license
Author Statement: Yes
Contribution Process Agreement: Yes
In Person Attendance: Yes