A lightweight model using frequency, trend and temporal attention for long sequence time-series prediction

Published: 01 Jan 2023, Last Modified: 19 Jun 2024 · Neural Comput. Appl. 2023 · CC BY-SA 4.0
Abstract: Although deep learning has achieved great success in improving the accuracy of long sequence time-series forecasting, its complex neural network structures, which comprise many different types of layers, each containing hundreds or thousands of neurons, challenge the computing and memory capabilities of embedded platforms. This paper proposes a lightweight and efficient neural network called TTFNet, which forecasts long time series using three types of features (i.e., the trend, temporal attention, and frequency attention) extracted from the raw time series. In TTFNet, we perform a pooling operation on the historical data in a recent time window to extract a general trend, use a multi-layer perceptron to discover the temporal correlation between data points as temporal attention, and apply the fast Fourier transform to the data to obtain frequency information as frequency attention. Each feature is extracted by its own neural network branch, which produces its own prediction, and the final prediction is a weighted combination of the three branch outputs, where the weights are learned during training. Moreover, the three branches can run in parallel since they are independent of one another. The experimental results show that the proposed method reduces the memory overhead and runtime by 62% and 81%, respectively, relative to the average of five counterpart methods, while achieving comparable accuracy.
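The three-branch structure described in the abstract can be sketched as follows. This is a minimal NumPy illustration of the general idea only; the layer sizes, pooling width, number of retained frequency components, and fusion weights are all assumptions for illustration, not the paper's actual architecture or learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def trend_branch(x, pool=4):
    # Average-pool the recent window to extract a coarse trend,
    # then repeat-upsample back to the original length.
    pooled = x[: len(x) // pool * pool].reshape(-1, pool).mean(axis=1)
    return np.repeat(pooled, pool)[: len(x)]

def temporal_branch(x, W1, W2):
    # One-hidden-layer MLP as a stand-in for the temporal-attention branch.
    h = np.maximum(0.0, x @ W1)          # ReLU hidden layer
    return h @ W2

def frequency_branch(x, k=8):
    # Keep the k largest-magnitude FFT coefficients and invert them,
    # a crude stand-in for frequency attention.
    f = np.fft.rfft(x)
    f[np.argsort(np.abs(f))[:-k]] = 0.0  # zero out all but the top k
    return np.fft.irfft(f, n=len(x))

L = 64                                   # window length (assumed)
x = np.sin(np.linspace(0, 6 * np.pi, L)) + 0.1 * rng.standard_normal(L)

W1 = rng.standard_normal((L, 32)) * 0.1  # randomly initialised MLP weights
W2 = rng.standard_normal((32, L)) * 0.1

# Each branch produces its own prediction independently (hence the
# parallelism noted in the abstract).
branches = np.stack([trend_branch(x),
                     temporal_branch(x, W1, W2),
                     frequency_branch(x)])

# In the paper the fusion weights are learned; here fixed logits are
# softmaxed purely for illustration.
logits = np.array([0.5, 0.2, 0.3])
w = np.exp(logits) / np.exp(logits).sum()
prediction = w @ branches                # weighted fusion of the 3 outputs
print(prediction.shape)
```

Because each branch only reads the shared input window, the three forward passes share no intermediate state and can be dispatched concurrently, which is what enables the runtime savings the abstract reports.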