TL;DR: Spiking Fourier Network enables efficient long-term prediction while significantly reducing energy consumption.
Abstract: Spiking Neural Networks (SNNs) have demonstrated remarkable potential across many domains, including computer vision and natural language processing, owing to their energy efficiency and biological plausibility. However, their application in long-term prediction tasks remains underexplored, primarily due to two critical challenges: (1) current SNN encoding methods are unable to effectively encode long temporal information, leading to increased computational complexity and energy consumption; (2) although Transformer-based models have achieved state-of-the-art accuracy in temporal prediction tasks, the absence of proper positional encoding for spiking self-attention prevents Spiking Transformers from effectively utilizing positional information, resulting in performance degradation. To address these challenges, we introduce an attention-free framework, the **Spik**ing **F**ourier Network (**SpikF**), which encodes input sequences in patches and employs an innovative frequency-domain selection mechanism to effectively exploit the sequential properties of time-series data. Extensive evaluations on eight well-established long-term prediction datasets demonstrate that SpikF achieves an average $1.9\%$ reduction in Mean Absolute Error (MAE) compared to state-of-the-art models, while reducing total energy consumption by $3.16\times$. Our code is available at https://github.com/WWJ-creator/SpikF.
Lay Summary: Spiking Neural Networks (SNNs), inspired by the brain’s efficiency, are emerging as a promising energy-efficient AI paradigm that communicates via spikes. However, their ability to model long-term dependencies remains limited—they struggle to remember and predict distant future events due to inefficient long-sequence encoding and the lack of an effective mechanism to track temporal context.
To overcome these limitations, we introduce Spiking Fourier Network (SpikF), a novel model that processes long input signals by decomposing them into smaller patches. Leveraging a frequency-based selection method, SpikF captures underlying temporal patterns without relying on computationally expensive attention mechanisms.
Evaluated on eight long-term prediction benchmarks, SpikF outperforms state-of-the-art methods, reducing prediction error by $1.9\%$ on average while consuming $3.16\times$ less energy. These results pave the way for ultra-efficient, brain-inspired AI in applications such as weather forecasting, stock trend prediction, and beyond.
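To make the patching and frequency-domain selection idea concrete, the following is a minimal NumPy sketch of one plausible reading of the mechanism: split a long series into patches, transform each patch with a real FFT, and keep only the strongest frequency components. The function name `patch_and_select` and the top-`k` magnitude criterion are illustrative assumptions, not the authors' actual SpikF implementation (which operates on spike-encoded inputs).

```python
import numpy as np

def patch_and_select(x, patch_len, k):
    """Illustrative sketch (NOT the SpikF implementation): split a 1-D
    series into non-overlapping patches, then keep only the k
    largest-magnitude frequency components of each patch."""
    n_patches = len(x) // patch_len
    patches = x[: n_patches * patch_len].reshape(n_patches, patch_len)
    spec = np.fft.rfft(patches, axis=-1)      # per-patch frequency domain
    mag = np.abs(spec)
    # indices of all but the k strongest frequencies in each patch
    idx = np.argsort(mag, axis=-1)[:, :-k]
    np.put_along_axis(spec, idx, 0.0, axis=-1)  # discard weak frequencies
    return np.fft.irfft(spec, n=patch_len, axis=-1)

# toy series: a dominant 4 Hz tone plus a weak 20 Hz component
t = np.linspace(0, 1, 96, endpoint=False)
x = np.sin(2 * np.pi * 4 * t) + 0.1 * np.sin(2 * np.pi * 20 * t)
y = patch_and_select(x, patch_len=24, k=2)
print(y.shape)  # → (4, 24)
```

Selecting a few dominant frequencies per patch is one way an attention-free model can summarize periodic structure in long sequences without the quadratic cost of self-attention.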
Link To Code: https://github.com/WWJ-creator/SpikF
Primary Area: Deep Learning->Sequential Models, Time series
Keywords: Time-series Forecasting, Spiking Neural Networks, Fourier Transform
Submission Number: 4311