Log-Based Anomaly Detection with Multi-Head Scaled Dot-Product Attention Mechanism

Published: 01 Jan 2021, Last Modified: 06 Feb 2025 · DEXA (1) 2021 · CC BY-SA 4.0
Abstract: Anomaly detection is one of the key technologies for ensuring the performance and reliability of software systems. Because of the rich information provided by logs, log-based anomaly detection approaches have attracted great interest in recent years. However, manually checking the huge volume of logs is time-consuming due to the ever-increasing scale and complexity of systems. In this work, we propose a log-based automated anomaly detection approach called LogAttention, which embeds log patterns into semantic vectors and then uses a self-attention-based neural network to detect anomalies in log pattern sequences. LogAttention can capture contextual and semantic information in log patterns and attend to much longer-range dependencies in a log pattern sequence. We evaluate LogAttention on two publicly available log datasets, and the experimental results demonstrate that it achieves better results than the existing baselines.
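The mechanism named in the title, multi-head scaled dot-product attention, can be sketched as follows. This is a generic illustrative NumPy implementation of the standard Transformer-style attention building block, not the authors' code; all function names, projection matrices, and tensor shapes here are assumptions for illustration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)   # (batch, seq_q, seq_k)
    # numerically stable row-wise softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

def multi_head_attention(X, num_heads, Wq, Wk, Wv, Wo):
    """Project X to Q/K/V, split into heads, attend per head, concat, project."""
    batch, seq, d_model = X.shape
    d_head = d_model // num_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv

    def split_heads(T):
        # (batch, seq, d_model) -> (batch * heads, seq, d_head)
        return (T.reshape(batch, seq, num_heads, d_head)
                 .transpose(0, 2, 1, 3)
                 .reshape(batch * num_heads, seq, d_head))

    out, _ = scaled_dot_product_attention(split_heads(Q), split_heads(K), split_heads(V))
    # merge heads back: (batch * heads, seq, d_head) -> (batch, seq, d_model)
    out = (out.reshape(batch, num_heads, seq, d_head)
              .transpose(0, 2, 1, 3)
              .reshape(batch, seq, d_model))
    return out @ Wo
```

Because every position attends directly to every other position, the dependency path between any two log patterns in the sequence has length one, which is what lets such a model capture long-range dependencies that recurrent models struggle with.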