Towards Energy-Efficient Sentiment Classification with Spiking Neural Networks

Published: 01 Jan 2023, Last Modified: 17 May 2025 · ICANN (10) 2023 · License: CC BY-SA 4.0
Abstract: Artificial Neural Networks (ANNs) have recently shown impressive results in Natural Language Processing (NLP) tasks. However, high energy consumption has become a major drawback of ANNs in NLP applications, running contrary to the goal of sustainable and efficient computation. In this paper, we propose a novel energy-efficient sentiment classification model based on Spiking Neural Networks (SNNs), which achieves high energy efficiency by exploiting the sparsity of neural activity and using spikes to encode and transmit information. Unlike conventional neural networks, which perform continuous and intensive computation, SNNs fire spikes only when they receive sufficient input stimuli, thereby reducing memory and computational overhead. We evaluate our model on the IMDB movie review dataset for sentiment classification. The experimental results show that, compared with the current state-of-the-art Transformer model, the energy consumption of the spike encoder model is reduced to 1.36% of the Transformer's, a 64.93-fold improvement in energy efficiency ratio. Furthermore, our model incurs a performance gap of only about 2%. Our research advances the field of “high-performance NLP models” and promotes further exploration of “low-energy NLP models”.
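The energy savings described above hinge on event-driven sparsity: a spiking neuron accumulates input and emits a binary spike only when its membrane potential crosses a threshold. The following is a minimal sketch of a leaky integrate-and-fire (LIF) neuron illustrating this behavior; the parameters (`tau`, `threshold`) and the function itself are illustrative assumptions, not the paper's actual model.

```python
def lif_neuron(inputs, tau=0.9, threshold=1.0):
    """Return a binary spike train for a sequence of input currents.

    Illustrative LIF dynamics: the membrane potential leaks over time,
    integrates incoming current, and fires (then resets) only when the
    accumulated stimulus is sufficient -- hence the sparse activity
    that makes SNNs energy-efficient.
    """
    v = 0.0          # membrane potential
    spikes = []
    for x in inputs:
        v = tau * v + x          # leaky integration of input current
        if v >= threshold:       # fire only on sufficient stimulus
            spikes.append(1)
            v = 0.0              # reset after spiking
        else:
            spikes.append(0)
    return spikes

weak = lif_neuron([0.1] * 10)    # weak input: the potential never crosses threshold
strong = lif_neuron([0.6] * 10)  # strong input: periodic spiking
```

With a sustained weak input of 0.1 the potential converges below the threshold and no spikes fire at all, while a strong input of 0.6 produces a spike every other step; downstream computation happens only at spike events.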