Abstract: Predicting equipment failures in the manufacturing industry by estimating Remaining Useful Life (RUL) is crucial for minimizing downtime and optimizing maintenance planning. However, developing effective RUL prediction models is challenging, particularly when capturing long-term dependencies in time-series data. Traditional Transformer approaches, which rely on multi-head self-attention, suffer from high computational complexity and insensitivity to local regions. To address these issues, we propose a novel approach called the Fusion-Attention Transformer (FAT) for RUL prediction. Our model integrates two key components: multi-head Logsparse Self-Attention (LSA) and multi-head Auto-correlation Self-Attention (ASA). LSA employs a logarithmic function and a local sparse strategy to reduce computation and enhance local information. ASA focuses on seasonality, revealing periodic fluctuations in the time series. The combined features of LSA and ASA are then fed into a feed-forward neural network for RUL prediction. Extensive experiments on public datasets show that FAT surpasses various state-of-the-art (SOTA) methods. The findings emphasize the improved performance achieved by combining different self-attention mechanisms compared to using them individually.
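A minimal sketch of how the two attention branches described above could be fused before the feed-forward network, assuming a PyTorch implementation. This is not the authors' code: standard multi-head attention stands in as a placeholder for both LSA and ASA, and the fusion by concatenation plus a linear projection, as well as hyperparameters such as d_model and n_heads, are assumptions for illustration only.

```python
import torch
import torch.nn as nn


class FusionAttentionBlock(nn.Module):
    """Sketch: fuse two self-attention branches, then apply a feed-forward network."""

    def __init__(self, d_model: int = 64, n_heads: int = 4, d_ff: int = 256):
        super().__init__()
        # Placeholder for multi-head Logsparse Self-Attention (LSA);
        # the paper's version uses a logarithmic/local sparsity pattern.
        self.lsa = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Placeholder for multi-head Auto-correlation Self-Attention (ASA),
        # which in the paper targets periodic (seasonal) structure.
        self.asa = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Assumed fusion: concatenate branch outputs and project back to d_model.
        self.fuse = nn.Linear(2 * d_model, d_model)
        self.norm = nn.LayerNorm(d_model)
        # Feed-forward network producing the representation used for RUL regression.
        self.ffn = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        lsa_out, _ = self.lsa(x, x, x)  # local-pattern branch
        asa_out, _ = self.asa(x, x, x)  # seasonality branch
        fused = self.norm(x + self.fuse(torch.cat([lsa_out, asa_out], dim=-1)))
        return self.norm(fused + self.ffn(fused))


if __name__ == "__main__":
    # Example: batch of 8 sequences, 30 time steps, 64-dim sensor embeddings.
    block = FusionAttentionBlock()
    out = block(torch.randn(8, 30, 64))
    print(out.shape)  # torch.Size([8, 30, 64])
```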