Attribute encoding transformer on unattributed dynamic graphs for anomaly detection

Published: 2025 · Last Modified: 20 Jan 2026 · Int. J. Mach. Learn. Cybern., 2025 · License: CC BY-SA 4.0
Abstract: Dynamic graphs represent connections in complex systems that change over time, posing unique challenges for anomaly detection. Traditional static graph models and shallow dynamic graph methods often fail to capture temporal dynamics and interactions effectively, limiting their ability to detect anomalies accurately. In this work, we introduce the Attribute Encoding Transformer (AET), a novel framework specifically designed for anomaly detection in unattributed dynamic graphs. The AET integrates encoding strategies that leverage both spatial and historical interaction data, enhancing the model’s ability to identify anomalous patterns. Our approach includes a Link Prediction Pre-training methodology that adapts the transformer architecture to dynamic contexts by pre-training on link prediction tasks, followed by fine-tuning for anomaly detection. Comprehensive experiments on four real-world datasets demonstrate that our framework outperforms state-of-the-art methods in detecting anomalies, thereby addressing key challenges in dynamic graph analysis. This study not only advances the field of graph anomaly detection but also sets a new benchmark for future research on dynamic graph data analysis.
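The abstract describes a two-stage pipeline: encode unattributed nodes from spatial and historical interaction statistics, pre-train a transformer on link prediction, then fine-tune it for anomaly detection. The sketch below illustrates that pipeline shape in PyTorch; the module names, bucketised degree/history encodings, dimensions, and heads are illustrative assumptions of ours, not the authors' exact AET design.

```python
# Minimal sketch of an AET-style two-stage pipeline (assumptions, not the paper's code):
# 1) encode unattributed nodes from structural/historical statistics,
# 2) pre-train a transformer backbone on link prediction,
# 3) fine-tune the same backbone to score anomalous edges.
import torch
import torch.nn as nn


class InteractionEncoder(nn.Module):
    """Encodes unattributed nodes from bucketised spatial (degree) and
    historical (interaction recency/count) statistics instead of raw attributes."""

    def __init__(self, num_buckets: int = 64, d_model: int = 128):
        super().__init__()
        self.degree_emb = nn.Embedding(num_buckets, d_model)
        self.history_emb = nn.Embedding(num_buckets, d_model)

    def forward(self, degree_bucket, history_bucket):
        return self.degree_emb(degree_bucket) + self.history_emb(history_bucket)


class AETBackbone(nn.Module):
    """Transformer over the encoded interaction context of a candidate edge."""

    def __init__(self, d_model: int = 128, nhead: int = 4, num_layers: int = 2):
        super().__init__()
        self.encoder = InteractionEncoder(d_model=d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers)

    def forward(self, degree_bucket, history_bucket):
        # (batch, seq_len) indices -> (batch, seq_len, d_model) -> pooled (batch, d_model)
        h = self.encoder(degree_bucket, history_bucket)
        h = self.transformer(h)
        return h.mean(dim=1)


def pretrain_step(backbone, link_head, batch, optimizer):
    """Stage 1: link prediction pre-training (does this edge exist?)."""
    deg, hist, exists = batch
    optimizer.zero_grad()
    logits = link_head(backbone(deg, hist)).squeeze(-1)
    loss = nn.functional.binary_cross_entropy_with_logits(logits, exists)
    loss.backward()
    optimizer.step()
    return loss.item()


def finetune_step(backbone, anomaly_head, batch, optimizer):
    """Stage 2: fine-tune the pre-trained backbone to score anomalous edges."""
    deg, hist, is_anomalous = batch
    optimizer.zero_grad()
    scores = anomaly_head(backbone(deg, hist)).squeeze(-1)
    loss = nn.functional.binary_cross_entropy_with_logits(scores, is_anomalous)
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    backbone = AETBackbone()
    link_head = nn.Linear(128, 1)
    anomaly_head = nn.Linear(128, 1)

    # Toy batch: bucketised degree/history indices for a length-8 context
    # around each candidate edge, plus binary labels.
    deg = torch.randint(0, 64, (32, 8))
    hist = torch.randint(0, 64, (32, 8))
    labels = torch.randint(0, 2, (32,)).float()

    opt1 = torch.optim.Adam(
        list(backbone.parameters()) + list(link_head.parameters()), lr=1e-3)
    print("pretrain loss:", pretrain_step(backbone, link_head, (deg, hist, labels), opt1))

    opt2 = torch.optim.Adam(
        list(backbone.parameters()) + list(anomaly_head.parameters()), lr=1e-4)
    print("finetune loss:", finetune_step(backbone, anomaly_head, (deg, hist, labels), opt2))
```

The key design point reflected here is that the same backbone is shared across both stages: link prediction supplies a self-supervised signal from the dynamic graph itself, and only a lightweight head changes when switching to anomaly scoring.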