SNN-FT: Temporal-Coded Spiking Neural Networks for Fourier Transform

Shuai Wang, Haorui Zheng, Yukun Chen, Ammar Belatreche, Guoqing Wang, Yeying Jin, Jibin Wu, Malu Zhang, Yang Yang, Haizhou Li

Published: 01 Jan 2025 · Last Modified: 13 Nov 2025 · IEEE Transactions on Neural Networks and Learning Systems · CC BY-SA 4.0
Abstract: The Fourier transform (FT) is a fundamental tool in modern signal processing, with widespread applications across science and engineering, which motivates continued research into energy-efficient FT implementations. Owing to their inherent energy efficiency, biologically plausible spiking neural networks (SNNs) are a promising alternative. However, current SNN implementations of the FT suffer from two key shortcomings: high latency and reduced accuracy. In this article, we analyze the underlying causes of these limitations and identify deficiencies in existing spike-based encoding mechanisms and spiking neuron models. We then propose a new SNN-based FT (SNN-FT) built on a logarithmically polarized time-to-first-spike encoding method (LP-TTFS) together with a novel piecewise ternary spiking neuron model (PTSN). The resulting SNN-FT is mathematically equivalent to the conventional FT while achieving higher accuracy and lower latency. We assess the proposed SNN-FT through extensive experiments on FT-based applications such as radar and audio signal processing, and the results demonstrate its efficacy and superiority over existing approaches. This study unveils an energy-efficient neuromorphic computing technique with great potential for FT applications across diverse scientific and engineering domains.
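As background for the encoding the abstract refers to, plain time-to-first-spike (TTFS) coding represents an analog value by the firing time of a single spike: larger values fire earlier. The sketch below is a generic, illustrative TTFS encoder/decoder only; the paper's LP-TTFS variant adds a logarithmic, polarized mapping whose details are not given in the abstract, and the function names here are hypothetical.

```python
import numpy as np

def ttfs_encode(x, t_max=1.0):
    """Map values in [0, 1] to spike times in [0, t_max]: x = 1 fires at t = 0,
    x = 0 fires at t = t_max (latest). Values are clipped to the valid range."""
    x = np.clip(np.asarray(x, dtype=float), 0.0, 1.0)
    return t_max * (1.0 - x)

def ttfs_decode(t, t_max=1.0):
    """Invert the encoding: recover the analog value from the spike time."""
    return 1.0 - np.asarray(t, dtype=float) / t_max

signal = np.array([0.0, 0.25, 0.5, 1.0])
times = ttfs_encode(signal)       # earlier spikes for larger values
recovered = ttfs_decode(times)    # lossless round trip for in-range inputs
```

Because the mapping is linear and invertible, an SNN operating on such spike times can in principle reproduce a linear transform like the FT exactly, which is the kind of mathematical equivalence the abstract claims for SNN-FT.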