An Interactive Attention Mechanism Network Integrating the C¹ Activation Function for Time Series Forecasting

Published: 2025 · Last Modified: 10 Jan 2026 · IEEE Internet Things J., 2025 · CC BY-SA 4.0
Abstract: Decomposing a time series into odd and even component sequences is an effective method in time series analysis. However, this data partitioning can weaken or even erase local features of the original sequence within the odd and even component sequences, thereby reducing model accuracy. To address this issue, in this article we propose a novel neural network with an interactive attention mechanism. To allow the decomposed odd and even component sequences to capture more global information from the time series and to compensate for the lost local features, we introduce odd-even fusion components. Through an interactive attention mechanism, the odd, even, and odd-even fusion component sequences complement one another, yielding feature subsequences with different temporal relationship weights. Furthermore, we introduce an improved spatial attention submodule with $C^{1}$ activation functions to better preserve local feature mappings. The piecewise polynomial $C^{1}$ activation function $PP(x)$ not only incurs low computational overhead but also effectively alleviates the vanishing gradient problem, improving feature recognition. Because $PP(x)$ is $C^{1}$, its gradient remains continuous during backpropagation, which stabilizes model training. Experimental results on multiple real-world datasets demonstrate the superior predictive and generalization capabilities of our model for time series forecasting (TSF) tasks.
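The odd-even decomposition referenced in the abstract is commonly an index-interleaved split. Below is a minimal Python sketch of such a split; the interleaving convention and the function name `odd_even_split` are our assumptions, as the abstract does not specify them. The example also hints at why local features can weaken: each subsequence sees the series at half resolution.

```python
import numpy as np

def odd_even_split(x: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split a 1-D time series into even- and odd-indexed subsequences.

    x: array of length T. Returns (even, odd), each of length ~T/2.
    """
    return x[0::2], x[1::2]

# A sharp local feature that spans adjacent steps is divided between the
# two subsequences, so neither half sees it at full resolution.
x = np.arange(8, dtype=float)   # [0, 1, 2, 3, 4, 5, 6, 7]
even, odd = odd_even_split(x)   # even: [0, 2, 4, 6], odd: [1, 3, 5, 7]
```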
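The abstract does not give the polynomial pieces of $PP(x)$, so the following is only an illustrative construction of a piecewise polynomial activation with the $C^{1}$ property (function value and first derivative continuous at the joins); the specific pieces, the width parameter `a`, and the name `pp_activation` are our assumptions, not the paper's definition.

```python
import torch

def pp_activation(x: torch.Tensor, a: float = 1.0) -> torch.Tensor:
    """Illustrative C^1 piecewise polynomial activation (not the paper's PP(x)).

    Pieces:  0                 for x <= -a
             (x + a)^2 / (4a)  for -a <  x <  a
             x                 for x >=  a
    Values and first derivatives agree at x = -a and x = a, so the
    function is C^1 and its gradient is continuous under autograd.
    """
    quad = (x + a).pow(2) / (4.0 * a)
    return torch.where(x <= -a, torch.zeros_like(x),
                       torch.where(x >= a, x, quad))

# Quick check: the derivative rises smoothly from 0 to 1 across [-a, a],
# with no jump at the piece boundaries.
x = torch.linspace(-2.0, 2.0, 9, requires_grad=True)
pp_activation(x).sum().backward()
print(x.grad)
```

The quadratic transition keeps evaluation cheap (one multiply-add per element) while avoiding the derivative discontinuity of a plain ReLU, which is consistent with the low-overhead and gradient-continuity claims in the abstract.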